Avoiding Gell-Mann amnesia tells me the media is often wrong. But the media isn't infinitely wrong (for example, I do believe there is a person named "Vladimir Putin", and he may even be Russian). So how do I know how much to trust vs. distrust the media? Just noticing that Gell-Mann amnesia is real and important doesn't answer that for me.
I'm not sure how true that is for people familiar with unusual topics that reporters know much less about than the average thing they write about.
Seemingly ... In this case they are less wrong about the facts, and more wrong about the interpretation. This is less bad, because one can come up with one's own interpretation. We should update our assessment of the media accordingly.
I mean, were you unaware that Vladimir Putin existed? Are there other sources you could get this information from? What does the current media give you that isn't provided better from another source?
For example, do you think you'd be better informed about Russia by reading the news or by reading, say, State Department press releases? What does the media do better than government press releases, other than making it more entertaining and more hateful?
I mean, from a sufficiently abstract point of view, all of those "sources" are the media. They're all intermediates between what happens and you. If you want to learn about things outside your immediate experience, you'll have to trust at least *some* media. For certain things it makes sense that the media you trust are the big professional organizations.
Also, to answer your specific question, what the big professional media do better than the government is not being the government, and therefore providing an alternate viewpoint that can run contrary to whatever hidden motives the government may have in showing you that press release.
I'm...not sure I catch this vibe and it feels very...abstract. Very concretely:
You can get your information from social media, like Twitter, which is occasionally extremely good and 90+% of the time utterly horrific.
You can get your information from news media, like the NYT, which has a long and detailed history of yellow journalism and outright deception.
You can get your information from government sources, like the FRED blog, which has hidden government motives.
Or, more realistically, you're consuming some mix of all of the above, plus a host of other sources. Why should the news media, a group of organizations with a really long history of bad behavior and lying, be in that mix? If you're scanning a few good Twitter accounts, subscribed to some substacks, and have some podcasts, you're already consuming a ton of media. Within that environment, why shouldn't you just stop paying attention to the NYT on, say, abortion and just go listen to the Dobbs oral arguments (1)?
I don't really have thoughts about the NYT, which I don't read and never regularly have. But professional news media does tend to have certain properties that are interesting, such as access to certain places that a random substacker probably wouldn't get to in high-level politics or business, etc. They also tend to have a lot of resources, which also gives them capabilities that smaller sources don't have. And they're professionals at synthesizing information in a digestible way, which is often more practical than reading the original sources.
There's also the sense in which, yes, news media have perverse incentives to maximize their readership, reputation, and revenue to the detriment of truth, but paradoxically you can trust them somewhat *more* knowing this. Compare a state-owned newspaper with a private for-profit one. Both distort information, but the for-profit one (maybe!) doesn't have malicious hidden motives, they just want to make more money.
All that said, I totally agree that a lot of the mainstream media isn't really trustable, and if you have reasons of your own not to trust an entire org such as the NYT, banning them from your media mix is perfectly reasonable.
>I mean, were you unaware that Vladimir Putin existed?
I certainly would be if I ignored all media! I've never met him in person, nor have any of my close acquaintances, so where else would I have heard the name?
I think the best solution I have found is to locate a conversation, typically online, where intelligent people with a wide range of political (and perhaps religious and scientific) beliefs argue with each other. You read a conservative who you have found to be intelligent writing about some issue and you probably have the best arguments for the conservative view of that issue. You read a leftist similarly. You then evaluate the arguments for yourself.
Like it couldn't decide what its argument was other than to diss EA. I mean, if there's this bad rich guy then surely the more of his cash you divert to helping ppl the better, right? And it acted like flattering rich people's egos was somehow a particular problem for EA, as if fancy thousand-dollar-a-plate dinners, offering large donors leadership positions, and otherwise fundraising via flattery weren't a thing for every last charity.
I have some theories about why this happens but I'll put that in a separate comment.
Thing is, it wasn't *his* cash but "all the people who invested with my fraudulent scheme" cash. That's what is getting him in trouble.
I have no particular animus against EA, but having a prominent swindler (let's just slap "alleged" all over this comment so that lawyers don't come after Scott; it's not him, it's me, your honour) rook money out of people in part by playing the charity card is going to look bad to people outside.
Take this from the Wall Street Journal, representative of the media coverage:
"Run by self-described idealists spending the wealth of their billionaire patron to make the world a better place, Mr. Bankman-Fried’s FTX Foundation and its flagship Future Fund touted deep pockets, ambitious goals and fast turnarounds.
Now Mr. Bankman-Fried’s fortune has disappeared, and the self-described philosopher-executives running the organizations have resigned. Grant recipients are scrambling for cash to plug the shortfall and fretting about the provenance of FTX’s largess after the company’s lawyers said this week that a “substantial amount” of assets were missing and possibly stolen.
Mr. Bankman-Fried often claimed philanthropy was his primary motivation for amassing a fortune. “It’s the thing that matters the most in the end,” he said in an April interview on the “80,000 Hours” podcast.
Will MacAskill, then a philosophy graduate student, pitched Mr. Bankman-Fried on the idea of effective altruism, a way of applying some utilitarian ideas to charitable giving.
“Effective altruists say you should use reason to compare causes and find the thing that can get you the highest return,” said Stanford University political science professor Rob Reich. “Giving to [elite universities] or the art museum is a much lower return on your charitable donation than giving to an antimalaria nonprofit that looks to prevent easily preventable deaths.”
Mr. Bankman-Fried had considered different career paths, he said in the “80,000 Hours” interview, but Mr. MacAskill suggested he could do the most good by making a lot of money and giving it away, a popular idea in the community.
Mr. Bankman-Fried set his sights on crypto, founding trading firm Alameda Research in 2017. He launched FTX a few years later. The price of bitcoin and other digital currencies surged, helping FTX become one of the world’s five biggest crypto exchanges. He soon helped establish the FTX Foundation, Future Fund and a family foundation, Building A Stronger Future. "
This is rather sympathetic coverage, but somebody could come away from that wondering how the heck this altruistic philosophy apparently left out the bits about "don't steal, don't cheat, don't swindle and don't lie". At best, he wanted to get rich quick to do good and took a lot of ill-advised shortcuts to do that. At worst, he used the high-falutin' notions as a cover for stealing.
Now EA has to deal with the fallout of "okay, one rotten apple". I mean, I'm Catholic, welcome to the way media coverage works when there is a huge scandal associated with you and yours, especially when morals/ethics/principles are involved.
Probably unfair of me because, like I said, I don't have any particular animus about EA, but I'm enjoying all this coverage because, well, that was a juicy one! It's great fun to read a big story like this when it's not your ox being gored.
Well, Scott's safe under Section 230. To address your main argument, I certainly agree that EA needs a way broader donor base and to get less of its money from one or two billionaires who might turn out to be unsavory (tho it still seems way better that they give even stolen money to charity than spend it on themselves).
But that's what's so frustrating about the coverage. If the argument was that EA is too reliant on a few big donors, well, fair criticism (tho not clear how actionable it is). But the arguments always seem to slide from "EA is too associated with a few big sketchy donors" to an implication that the cause itself is somehow suspect.
And it's always been the way that sketchy rich ppl launder their public images by charitable donations and whenever it happens with other charities the attitude taken seems much more like mine: well it's good that money went to great causes.
Sure, if there were a bunch of GiveWell ads or propaganda featuring Sam then that criticism would be fair, but it seems like what happened is that the journalists latched on to Sam's self-promotion as a big EA person (who got no more than the usual official recognition of big donors from EA institutions) and are now acting like EA should have refused the funds or somehow stopped the journalists from making the association.
Yeah, it definitely seems like SBF and the FTX implosion are just a hook to regurgitate the same arguments that people were making about EA a month ago (and the month before that, and ...). Like "this rich guy who was a prominent EA donor and advocate turned out to just be running a giant Ponzi scheme which just goes to show why my argument that we can't know anything about the world so trying to figure out what is most "effective" will lead us to disaster is completely correct for some reason".
Yeah, guys, but you still have to address the argument - and it may be an unfair argument, but this is the rod EA cut for its own back - that "EA was all about doing good better. That is, they were going around telling everyone that all the other charitable stuff was rubbish, everyone else was doing it wrong, you should put the welfare of strangers over those nearer to you, and it was all backed up by statistics and the only correct philosophy. Well, *now* look what your star pupil has gone and done. But *I'm* the idiot for putting ten bucks in the collection plate in church instead of handing it over to you lot?"
EA says 'we've figured out a better way to do good'. EA then gets mixed up with a guy who was conning everyone out of their socks. EA wasn't so smart, was it? And if it says it can figure out things better than ordinary people because it doesn't let emotional attachment or sentimental associations affect what the cold equations of efficacy tell it, then it should be held to a higher standard. But it turns out the EA movement was just as easily suckered as the ordinary guy in the street, so yah boo sucks to you!
> That is, they were going around telling everyone that all the other charitable stuff was rubbish, everyone else was doing it wrong
I mean, no it wasn't. Maybe you have an example of someone in the EA world saying this but I have never seen anything that can be interpreted in that way unless you take the position that literally suggesting anything is equivalent to saying anything you didn't suggest is garbage.
> Well, *now* look what your star pupil has gone and done
Rich guy who very publicly gave a lot of money to charity and used it for PR turned out to be a scumbag. Not exactly some novel failure of EA in particular. I mean, charities accept donations from people. Sometimes those are bad people. Sometimes those bad people use their charitable giving to launder their reputation. It's bad and I think the EA community should absolutely learn some lessons from this fiasco, but sort of mundane lessons about the relationship between any institution and its stakeholders.
> EA movement was just as easily suckered as the ordinary guy in the street, so yah boo sucks to you!
Were they suckered? Did they claim any particular expertise in evaluating the solvency of cryptocurrency exchanges? Almost every one of the world's most sophisticated investors got duped by SBF and FTX so I don't see it as particularly discrediting or surprising that a bunch of guys who run charities failed to crack the case.
More to the point, does any of this change our evaluation of how effectively they were actually using SBF's money? Not sure I see how.
The unfairness of the coverage is more to do with the assumptions about the influence EA and/or utilitarianism had on Bankman-Fried.
Utilitarianism is taking quite the bashing, even more so than EA. The same breathless, syrupy admiration about this being his motivating philosophy is very easily turned around into condemnation.
Utilitarianism helped 12 year old Sam figure out that his parents, his friends, and all those around him who were pro-abortion were right! How wonderful!
But that same utilitarianism also helped 30 year old Sam figure out that robbing Peter to pay Paul was perfectly okay, too. How dreadful!
I do feel sorry for Will MacAskill, who is getting portrayed as Bankman-Fried's guru - and when the pupil goes astray, you have to look at what the master was teaching them. (But not *too* sorry; he's the guy who, as one comment in another thread suggested, said he didn't have kids because they would interfere with his work. That work that is so important for the world. Well, now he and his work are getting a black eye all over the place; makes the distractions a couple of kids might have caused him look a lot better, eh, Will?)
"Will MacAskill, then a philosophy graduate student, pitched Mr. Bankman-Fried on the idea of effective altruism, a way of applying some utilitarian ideas to charitable giving.
Mr. Bankman-Fried had considered different career paths, he said in the “80,000 Hours” interview, but Mr. MacAskill suggested he could do the most good by making a lot of money and giving it away, a popular idea in the community. "
If he doesn't want kids because he doesn't like kids, that's a perfectly fine reason. Phrasing it as "I am Too Important, my time is Too Valuable, the world itself would suffer were I to be distracted by the demands of fatherhood" just sounds like you are an insufferable swelled-head with delusions as to your ultimate grandeur. You could be hit by a bus tomorrow, Will, and the world would stagger on without you.
Will MacAskill literally writes in his book 'What We Owe the Future' that one of the things people should consider to make the future better, is to have children.
I don't know whether he personally has plans to have kids, but the idea he thinks "I am Too Important, my time is Too Valuable, the world itself would suffer were I to be distracted by the demands of fatherhood" is a straw man in the extreme. Where does this idea come from?
"You could be hit by a bus tomorrow, Will, and the world would stagger on without you." This is an incredibly poor taste thing to write on a public forum and I read it as very close to "I wish you dead". It's the kind of comment I'd like to see moderated but I'm new here - is this kind of thing acceptable here?
What's wrong with "things other than kids fill my time in ways I find meaningful"? Things go wrong with work, things go wrong with kids too. A lot of people have really disappointing relationships with their kids.
The cause is suspect because it is a cause that appeals to sociopaths. Treating the life of some child in the Congo as of equal value to the people in your neighborhood is sociopathic behavior. Or whatever label you want to slap on it. It as a philosophy that appeals to people high on rationality and low on traditional human connections and values. And I can identify it easily because I am one of them.
That may be part of the resentment but I don't think that explains it all. You also need the feeling that they are saying they are better than you, and that it feels plausible enough to make you feel a bit bad about it.
I think most ppl don't feel the congo kid has equal value but feel really bad about saying that out loud because of the values we hold up as a society. EA's are those weird ppl that are kinda creepy who force them to face this tension and that creates resentment.
Like I think cryo or belief we live in a simulation has a similar profile for who talks about it but tends to just generate a feeling of "look at those weirdos" (unless u are in a personal rel with someone who believes) and maybe crit for wasting money but not the same schredenfraud (not even going to try spelling it right) bc it doesn't create any feeling of guilt or self-doubt.
Ah yes, the classic sociopath behaviour of... making significant personal sacrifices in order to help others?
I know you said 'or whatever label...', but 'sociopath' is too loaded a word to radically redefine and then apply to people who don't fit the normal definition. Caring about others and taking altruistic actions to help them is about as far from sociopathic behaviour as you can get. (And most non-EAs do very little to help the people in their neighbourhood anyway.)
>Ah yes, the classic sociopath behavior of... making significant personal sacrifices in order to help others?
I would rather frame it as the classic sociopath behavior of so little valuing your actual human connections and relationships that you can somehow reason yourself into the position that distance (physical and/or social) doesn't matter ethically. It's the position of someone who doesn't have normal human feelings.
>most non-EAs do very little to help the people in their neighborhood anyway
Most everyone does. But not most people fired up about helping others, which is the group of people we are talking about and EAs are trying to proselytize to.
Wait, how does donating 10% of your income to global health charities trade off at all against having warm relationships with your friends and wider community? I think a randomly selected EA would most likely advise me to be doing both. Are you sure you're not arguing against a straw man here? I will admit to thinking that physical distance has no intrinsic ethical relevance, but it's strongly correlated with social distance, which is indeed ethically relevant, as you pointed out.
I’m reminded of the “stochastic terrorism” line of reasoning. Some ideas are just so dangerous we can’t responsibly speak them out loud: trying to do good most efficiently, because easily influenced minds might be corrupted into billion dollar fraud.
Saw the NYT holding Chris Rufo responsible for the latest shooting; felt pretty wild.
Oh, the "stochastic terrorism" line is a great example. I saw it popping up all over the place and I was mightily puzzled because of cloudy memories of stochastic equations in chemistry, could not figure out what that had to do with terrorism.
What are some 'leftwing-coded' attacks you think would be good examples of this phenomenon? What word should the dictionary promote after such an attack if it wants to balance out promoting "stochastic terrorism"?
Statements by left-ish journalists, activists, and politicians promoted and fanned the flames of the riots of 2020. That seems like a paradigm example of stochastic terrorism on a very significant scale.
Another example would be the rhetoric of the Nation of Islam—an organization Democratic establishment politicians still suck up to. Farrakhan’s bizarre vitriolic anti-Semitism is echoed in events like the Jersey City deli shooting some years ago, along with frequent lower-grade acts of anti-Semitic violence by Black Hebrew Israelites and sympathizers. (Maybe this tendency will realign rightward, though—the bizarre ideology now includes Kanye among its adherents.)
For what it’s worth, I don’t use the term “stochastic terrorism”, I think it’s loaded language promoting the idea that “speech is violence”, a notion I despise. But if I were to adopt the term—it’s obviously not limited to the right.
Is it really that puzzling? Specific examples of whether or not some violent act was because of stochastic terrorism can be debated, but it's very obvious when Tucker Carlson goes on TV and says hospitals are mutilating children and performing illegal surgeries (which they aren't doing), and then hospitals see floods of threats of violence coming in that the two are related. You could either argue that threats of violence are entirely unrelated to actual violence and so the actual terrorists are entirely unrelated, or you could argue that Carlson didn't actually intend for people to threaten hospitals. I think you'd have a hard time arguing either of those.
Do you know what the steelman description of 'stochastic terrorism' is, and do you doubt that it actually exists and is important?
Also, let me ask you this: why did lynchings of black men used to happen fairly regularly, for a set period of time in a set part of the world, and not before or after or in other parts of the world?
2. I don’t doubt that the phenomenon occurs that people speak in opposition to other people in bombastic, violence laden rhetoric and other people do violence based on that. The level of importance is unclear to me when weighed against stuff like being able to criticize cultural practices you find immoral. Any proposed solutions I’ve seen to stochastic terrorism seem worse than the illness.
I don’t think Chris Rufo is a good faith actor, but holding him responsible for the latest shooting seems implausible. On a personal level I feel pretty weird about drag personas interacting with young children. Based on my own non-zero experience hanging out in gay bars and around drag queens, I see nothing redemptive in this cultural project. If Chris Rufo wants to criticize it in bad faith, eh. Live and let live, as my fellow liberals used to say.
I really don’t follow where you’re leading here, but I’ll play: organized and ad hoc white supremacy? The KKK plus white land owners wanting cheap labor and white laborers feeling threatened by slaves and newly freed blacks harming their bargaining power creating incentives for animus along racial lines?
"why did lynchings of black men used to happen fairly regularly, for a set period of time in a set part of the world, and not before or after or in other parts of the world?"
How do you know lynchings or other extra-judicial killing of black men (and white, yellow, red and brown men) did not happen "before or after in other parts of the world"?
The very derivation of the term "lynching" and "lynch law" comes, ultimately, from an Irish source:
The story is that a mayor of Galway, surnamed Lynch, hanged his own son for the murder of a Spaniard because the young man was so popular with the people that no-one else would carry out the sentence. This got brought along to the USA where different lawmen named Lynch were supposed to have carried out hangings, and eventually the term "lynch law" for vigilante justice was adopted as the common term in use.
Are you going to maintain that:
(1) Only people in the USA were ever lynched (or, as I said, otherwise killed by mobs)? That's not so.
(2) Only black men in the USA were ever lynched? Again, that is not so.
But if you mean "mob murder used as a method of terrorising the black population of the American South, amongst other methods of maintaining a racial disparity and subjugation", then yeah, that happened. I don't know the history of black people in the North and if any of them were ever lynched or otherwise unlawfully killed.
So if you are defining "stochastic terrorism" to mean practices like that, I might go along. But "stochastic terrorism" has been used to refer to parents speaking up at school board meetings, as is their right. If the person using the term wants every classroom to be draped in rainbow flags, then they can make that case - but they don't get to call people who oppose that 'domestic terrorists' or 'stochastic terrorism'.
>But "stochastic terrorism" has been used to refer to parents speaking up at school board meetings
By who? Where?
Please share a link, and if your description is accurate, I will join you in laughing at the idiot who misused the term in such a way.
But the existence of idiots doesn't undermine the reality that the term refers to, which I think we agree on (yes I'm referring to the specific US thing).
So an interesting question is why does EA draw this kind of coverage and I have a start of a theory.
I feel that by raising the issue of how effective a donation is, EA makes a certain kind of person feel guilty. Before, they'd felt good about themselves for any charitable donation, but now, once cogent arguments are made that some donations are better than others, they feel guilty for not putting in the time to figure out the best donation. You can try to tell them that's not the point and no one is suggesting they feel bad about charity, but that doesn't stop the feeling of guilt for many people.
And people resent having something they felt good about suddenly make them feel guilty. A common reaction to feeling guilty because of criticism is to lash out at whoever made you feel that way and attack their virtue -- even, or perhaps especially, when the guilt is really coming from your own self-criticism.
I wish I had a good fix for this, since it would be a huge boon for EA to avoid inducing this negative affect. To a degree you can minimize it by emphasizing that you can just let GiveWell figure out what's best, but that still leaves some people feeling bad about their emotional desire to donate to help starving dogs or whatever. And, even if you insist they shouldn't, the argument that it does more good to donate this other way is going to make a certain sort of person feel bad if they don't donate that way.
And (though this is much more speculative) for some people, I think there is also an element of who the criticism is coming from, e.g., there is a feeling that these nerds with calculators are invading the domain of journalists and less quantitative types and setting themselves up as arbiters of social value.
Hm, I doubt this. For the simple reason that most people have no clear idea what EA actually is, and roughly zero emotions about them. I doubt very much that they feel guilty because of EA ideas. If they have heard of EA at all before the FTX incident (and not because they investigated the background of FTX), then it is just some of the many many weird group of people doing weird stuff.
A much simpler hypothesis is: this happens every time something goes wrong. Certainly in cases like this, when someone did something evil. But even when something goes wrong without anyone's fault. I have seen it a zillion times. But usually we don't care, because we are not personally attached to the group that is treated unkindly.
By the way, I don't think these unfair treatments are a special feature of the press. It's a general human trait to assume that if something went wrong, then people somehow related to that must have done something wrong. It's just that reporters write it down.
I think it's a lot simpler. The media are plugging the EA angle because Bankman-Fried made a big deal out of his philanthropy. If he said he had been influenced by, I dunno, movies about Santa Claus as a kid they'd be all over that one.
This is someone who claimed a virtuous motive but was (all the time? towards the end?) acting viciously. It's the ever-popular hypocrisy angle. Every megachurch preacher who ever ended up in bed with a congregant's wife, or with his fingers in the donation box, or in a motel with a rent-boy gets the same treatment: man of public virtue lives secret life of sin, was it all a con job from the start?
The whole angle of "utilitarianism" and "effective altruism" is strange and unfamiliar enough to the majority of people that it's a novel angle to explore. How could a guy who was being mentored by/working with Big Names in an ethical movement turn out like this? How could the Big Names not realise what was going on? It's the religion angle but with a secular flavour this time round.
It's not just that they are all over the fact that a rich guy who turned out to be sleazy made a big deal of his EA giving (that's fair) but there seems to be an active desire to suggest that EA is somehow rotten because it took money from ppl who wouldn't have done good things with it and spent it doing good things.
I agree there is something to what you say. But I still think it's treated differently than other kinds of charities, where slimy rich ppl using them to launder their reputation is basically seen as the cost of doing business: bc ppl see EA (wrongly imo) as claiming a kind of higher moral status, they then see this as a kind of karma (much like how religious orgs get held to higher standards than other orgs).
It's a perfectly reasonable response if you understand EA as looking down on ppl for their less virtuous giving but that's a mistake. But an understandable one.
Absent a movement of santas making people uncomfortable due to their insistence on effective gift giving (“give gifts to the nice, and fulfill the basic energy needs of the naughty”), I'm not sure the media would go at the santa movie angle. “Those people who made you feel bad about yourself? We're taking them down a peg!” is a necessary ingredient.
But not enough people know about EA to be made to feel bad about themselves. Most people's reaction to this news is going to be "There's a what? Called what? That does what?" and thinking it's some weird California techie thing. Not many people knew about the earnestness (at the start) about giving at least 10%, and many religious people in the US tithe anyway and wouldn't consider this a reason to feel bad about themselves, nor about the "help poor children in Africa with mosquito nets" thing.
*If* you were aware of EA and *if* you thought you should be donating to good causes and *if* you weren't donating, that *might* evoke the "these people are making me feel bad, I don't like them and want them taken down a peg" reaction. But that's not a lot of people.
I do think the simpler explanation is that this is the old scandal of hypocrisy: guy who claimed to be doing good was in fact a liar and thief.
I certainly agree that the negative coverage of Sam FB (or whatever his name is) is completely explained by what you say. It's why the whole Lance Armstrong thing was such a feeding frenzy. But that's not what needs explanation.
What didn't happen with Lance Armstrong is anyone writing stories about how cancer charities he supported or starred in ads for were somehow bad as a result. Every time some guy who is a big proponent of MeToo is shown to be a creep, we don't get stories implying that there is something wrong with MeToo.
The relevant ppl here are the journalists writing about it. The phenomenon that is being explained is the attitude taken in the coverage of EA and every one of those journalists necessarily is aware of EA or they couldn't be writing about it.
I agree with you (also, seems that other people on this thread almost entirely agree with you too, though they phrase it differently). I think people feel bad when some person X suggests that he is "holier than thou" - it happens with religious people too, where they cause a similar aversion in the non-religious (or different-religious) and people revel in stories where they violate their own norms. It's not enough that person X be discredited, because their ideology is what makes you feel guilty.
Yeah, hypocrisy is very easy to condemn because it doesn't require any shared ethics whatsoever.
But I do think there's also the aspect of pulling down someone who's got too big for their britches, as it were. When so-and-so is putting on a posh accent, putting on airs, acting white, pretending that they're above it all, I think there's some sort of basic monkey instinct to drag them back down into the muck with the rest of us. To prove that they're no better than anyone else, that it was all just lies and flim-flam, that anyone who gets ahead is doing it by dirty dealing. That we're justified and righteous in our own mediocrity, and that our lack of similar success is not our fault, but in fact due to our virtues. It's all well and good to get ahead, but woe upon those who do it while proclaiming to be better than other people.
“Making a structure with data that does feelings better than regular feelings” is another angle. As if the nerds-with-calculators - who are assumed to lack emotions - used numbers and data structures to make a model of caring, and then used their CaringData to beat everyone else at their own game. Presumably then regular people have nothing left; the nerds are smarter, and CaringData gives them better emotions.
The amount of the fraud is spectacular. But the feeling is still weirdly innocent. Like a crime boss pretending to be a perfect person while doing dirt. They did a lot of morally ambivalent or much worse things - while waving the Emotional Perfection flag. Usually people who do dirt eventually embrace the ambiguity, enjoy the suits and cigars, know the lies are lies. I feel like the conference rooms where the fraud was planned probably had organic fair trade coffee. It’s crime without a crimey feel. It makes the facade of innocence potentially more ridiculous.
Most people, when faced with the evidence that this charity works better than that charity, would change donations to the better charity. Your second point might have some validity though.
Lots of people also like to donate to charities which tug on their heartstrings. If people were disposed to behave as you suggest, they would already be EAs.
Also, people can get the message that they should be picking the best one and feel they should do some research, but then feel guilty when they never get around to it. You see this a lot with voting.
I'm surprised to hear you say that and want to explore more about where our models of the world differ. Do you think that most people giving money to their college's alumni fund or their church's soup kitchen simply have never considered that giving to Against Malaria Foundation might be better?
While I disagree with Nolan's claim, I'm not sure most people have really made the comparison between charities as you suggest. Sure, they may have looked at overhead, but I'm not sure most people think of charities as comparable in that way, in the same way most people don't rank the relative benefits of doing favors/nice things for their various friends and relatives. Sure, at some point they will think "John and James are rich enough that they can just hire movers" or the like, but within a large region no comparisons are made.
Indeed, I think many people think of charity as a kind of extension of our normal interpersonal assistance, where we usually think that people should be disposed to help those they have connections with. So rather than ask which charity is more effective, they ask which cause they feel more connected with (e.g. if their sister died of cancer or their dad died of malaria, they choose that charity).
Hmm. Despite saying most people I might be talking about myself.
However, here in Ireland, after some scandals where it became clear that in some charities most money went to staff and executives (many of the latter living it up), there was a general movement away from those charities toward charities that spent more on recipients than on staff. I was part of that. I realise that I was merely changing to charities that were probably more efficient and probably less corrupt, but not certainly more effective.
I try to look at established charities with skin in the game as well; for instance, I like Médecins Sans Frontières. In that case staff costs are high, but it's mostly doctors.
EA I heard about here first. The anti-malarial nets intrigued me, though, so if there were a way to donate to similar charities I would be interested.
> "A common reaction to feeling guilty because of criticism is to lash out at whoever made you feel that way and attack their virtue -- even, or perhaps especially, when the guilt is really coming from your own self-criticism."
Or you could call it "EA Fragility" and write a book!
I think you need to recognize that Ineffective Altruists (donors of church picnics and dog shelters) are the outgroup of Effective Altruists. Non-Altruists are just some far-group. You can recognize this by noticing who annoys you more (if you're sympathetic to EA) - the person who donates to the dog shelter (or donates their hair to cancer patients), or the person who doesn't care much for charity. Who would you feel the urge to argue with and defend against?
EA needs to find a way not to alienate these people, and it's very hard, maybe impossible, not to alienate your outgroup without losing your identity.
Many good points given the superficial and unfair criticisms of EA/longtermism.
That being said, while EA does a lot of good, it suffers from many genuine structural problems media do not mention. EAs are occasionally aware of these, but mostly consider them as negligible/intractable. A friend of mine mentions naive, antisocial utilitarianism attracting Machiavellian types (not that properly contained utilitarianism is bad), partisanship and political bias way exceeding what's necessary to operate on the nonprofit scene, general mistreatment of volunteers, interns, and low-ranking employees, dynamics creating vicious loops worsening mental health, self-recommending tendencies, lack of transparency leading to distrust in trendsetting institutions, unproductive credentialism/elitism, and major loopholes in cost-effectiveness models.
In the end, "sort-of-EA-adjacent-people" become a better match for Great EAs than Prominent EAs.
The reason EA is taking so much heat is that its richest and most visible proponent a) claimed to know how to make the world, in general, better (EA principles), but b) actually made the world a lot worse (by actually being a cynical crook). It is obvious why this would result in negative press for EA and why people would update their priors in a way that is unfavorable to EA. If the head of Mothers Against Drunk Driving stole several billion dollars, people would also update their priors against MADD. What is actually interesting about the media coverage is not that it turned against the movement of crypto Bernie Madoff but that it was so credulous in the first place about a guy who claimed to be donating all his money to EA charities but actually owned a jet.
I mean, that is fair enough. What annoys me though is not people arguing that EA institutions have serious problems or are corrupt so much as the arguments that the entire philosophy of EA is somehow discredited. To use your example, it's like if the head of MADD stole a bunch of money and people used that to argue that drunk driving is actually fine and we were wrong to ever worry about it.
Yeah, we probably should. Not with posts like this, though. Responses should be thoughtful and charitable. It's also ok to vent with a sympathetic crowd.
I disagree there. Responses to good faith criticism should be thoughtful and charitable. Response to bad faith criticism and mockery should be snark and trolling.
Detecting the difference between good faith and bad faith is very very hard, and the "if they're serious then they're so negligent it might as well be bad faith" doesn't, in my opinion, actually get you out of that problem. I would guess most of what's happening is "there's a current event and a thing I want to say and they have the same word near them, that's good enough for an article," not "how can I dishonestly ruin EA's day?"
I agree it’s really hard to tell. I just don’t think the solution is to always assume good faith as a result. There’s a cost to both Type 1 and Type 2 errors, and IMO EAs heavily bias towards trying to assume and respond in good faith, even when it’s not optimal.
I admit I don't pay a lot of attention to EA beyond what I read around here, but my impression so far is that it's basically a fairly trite set of ideas espoused by people who don't understand the world very well and don't get the point of the old saying that the road to hell is paved with good intentions. But I do agree that some of the recent FTX/SBF-related articles that are critical of EA are superficial and silly.
Do you agree with them? Do you think more than 5% of people in the US agree with them? If not, I don't think they can be described as "trite", except in the same sense where Christianity is "trite" because people have been doing it for a long time.
I think people actually *don't* agree with the only thing that is unique to EA, which is the set of strange and allegedly scientific assumptions it overlays on top of standard outcome-oriented philanthropy.
If everyone agrees with the claim "it's morally important to donate 1-10% of your income or some other equivalent amount of some other resource to whatever charity you think is most effective at decreasing suffering / increasing utility", how come so few people (I would guess <1%) do this?
I think this is the main claim of effective altruism, any additional strange assumptions are just squabbling about implementation details from within the movement.
I think if your only definition of effective altruism is that you have to donate a small-to-large portion of your income to charities you think are good, then you are defining the movement so broadly as to make EA unworthy of having its own name. It would just be charity at that point. You of all people should not be hiding in a motte. Strange and unpopular ideas like focusing on AI risk, discounting future harms very highly, and quantifying the utils of a chicken sandwich aren't squabbling over the details. They are the only things that separate it from the standard concept of charity, which is the thing everyone actually likes.
I specifically said 1-10% because Peter Singer says 1%, Toby Ord says 10%. I think less than half of people give 1% and less than a twentieth give 10%. I also said "to charities that most effectively reduce suffering/promote utility". Unless you have some kind of galaxy-brained argument, that excludes churches, colleges, local food banks, theaters/operas/symphonies, and most other things Americans donate charity to. I think only between 1-5% of people fit these criteria depending on whether you use Singer's 1% number (5%) or Ord's 10% number (1% or less).
I know everyone wants you to believe that EA is only about speculative AI projects, but about 2/3 of EA donations still go to global health, malaria, things like that. There are many people in EA who have never touched animal welfare or AI risk.
I’ll try to steelman this a little, as someone who only makes it up two and a half levels on the EA tower.
There IS definitely novelty and value in the idea of systematically focusing on outcomes and donating 10%. The problem is that, in the real world, the EA movement is associated with the entire tower, not just its lower levels.
The reason there’s no sympathy for EA is that they’ve begun to focus on things that have huge and obvious failure modes. Think about the concept of “earning to give.” There’s natural skepticism to that because it just sounds too much like a justification someone would give for their own extreme selfishness. For every person who would truly earn to give, there are a dozen who would say they’re doing that as an excuse to enrich themselves.
Another example: The issue of AI Risk. At the end of the day, investing in AI risk means funneling massive salaries to PhDs, i.e. highly-educated elites. That’s terrible optics, because suddenly EA goes from “giving to the poorest in the world” to “paying rich people’s salaries” and it doesn’t matter what the justification is, it looks really bad and stinks of motivated reasoning to outsiders.
And to be clear, I’m not saying AI Risk isn’t a real thing we should worry about. I’m saying that no one should be mentioning it in the same breath as global health. That’s a fine line to toe, and I know it’s complicated when so many members care about both of those things, but I still believe it was a mistake to add more levels to that tower.
Before 2016, ask the average college student whether "all lives matter." Most everyone would have said yes to that general sentiment. But now, those words have become particularized - loaded with specific, additional, controversial content - and it's harder to get a "yes".
Similarly, ask anyone whether they agree with: "giving some of what you have to those in need is good." Sure, most do! But that doesn't mean they have to agree with your "giving 1-10% to charities that most effectively reduce suffering", given the specific controversial consequentialist loading you've given to "effectiveness". Even if it seems like a mere rephrasing, the introduction of loaded language will still lead people to reject the latter sentence. There's no inconsistency there - most people just aren't consequentialists.
You've become unable to understand people who think sufficiently differently from you. I think once you would have tried to understand and fix them; now you rant at them, impatient with the fact that they haven't just started giving their 10% already. And I have become impatient with your impatience; I think I'll unsubscribe. It's for the best - the bad outweighs the good in subscriber-only posts. I'll miss the poems, but I'm not going to pay money to get lazy ragebait in my inbox, there are more than enough news sites offering that service for free.
Good intentions don't always lead to hell. How can we convert good intentions into the least hellish outcome? Perhaps some movement that attempts to think hard about all the possibilities, constantly seeking constructive criticism.
This is kind of how I think of EA. I had decades of frustration around giving. I was low income but knew that there were places in the world where even the meager amount I was able to give could have a positive impact. And I wanted that. Discovering GiveWell was the answer for me. I was sick of looking into charities that attempted to provide clean water or education, only to find that they were ignoring local customs and infrastructure limitations, and creating more mess than happiness. EA did indeed seem to convert good intentions to good outcomes instead of hellish ones.
>Young people should resist the lure of political activism and stick to time-honored ways of making a difference, like staying in touch with their family and holding church picnics.
Based and clear pilled.
(I'd guess the media is unfair to EA sometimes mostly because it's still fairly novel and doesn't have a "this is just normal and how things are" vibe yet. But as others have pointed out, the media is often unfair to anyone it considers a threat, and EA is implicitly a pretty big power grab.)
It’s good that you’re trying to be helpful, but some ideas are just too dangerous to share, like effective altruism or Curtis Yarvin.
(This is a joke, but made uncomfortable for me because I think Curtis Yarvin borders on “stuff you shouldn’t put into your brain because it makes you a worse person.” He’s like a Sith version of an ayahuasca shaman.)
Actually, it might be a good idea to expand a bit on Incanto's interesting, perceptive comment.
In Scott's post, "resist the lure of political activism" occurs in a parodic context. It's not meant seriously, but as a reductio ad absurdum: talking this way is so obviously ridiculous that all Scott needs to do is to point out the similarity with how the media talks about EA, and the point is made.
In contrast, Yarvin, with his "clear pill", really does want to convince people to resist the lure of political activism. I think it's good to highlight that disagreement, since Scott has been accused of secretly supporting Yarvin.
I think Scott has been pretty vocal about thinking political activism is a high effort, low reward activity. Indeed, this is a premise of much of EA: otherwise they’d be all “help reduce suffering, vote Democrat/NRX/Ron Paul!” The rationalist sphere has been instrumental in pushing me out of political activism, if only by providing me a more humane framework to replace the totalizing lefty worldview.
While this might be big-picture true about the way EA is covered, the notable part about its coverage this time around (FTX situation) is that it’s been insanely positive/hands off. There have been opinion pieces and such from EA haters, but I found the establishment news to be remarkably sympathetic toward SBF and his goals to change the world for the better.
Separately, a few of these fake headlines make good points!
I think EA's are to some extent selectively seeing negative coverage (like performers who lose sleep over that one negative review). It seems to me that most popular news sites don't go into details about EA when covering FTX, or don't cover it too negatively if they do. It's true that for a certain subset of nerdy and judgy people who pay attention to EA and decided that they hate it, SBF is a godsend and a confirmation of everything they believed is wrong about it. But in my experience these are people who already made up their minds and even they will soon move along to expressing their condemnation of the next trending thing they hate. That said, I enjoyed the examples here!
It's not just that they are all over the fact that a rich guy who turned out to be sleazy made a big deal of his EA giving (that's fair); there also seems to be an active desire to suggest that EA is somehow rotten because it took money from people who wouldn't have done good things with it and spent it doing good things.
I mean c'mon, the media isn't so naive as not to recognize that every bad person ever tries to associate themselves with charitable giving. Plenty of major international charities happily take money from the sleaziest people around without generating much negative comment.
But I think a better way to phrase my claim is this: because people see EA (wrongly, imo) as claiming a kind of higher moral status, they then see this as a kind of karma (much like how religious orgs get held to higher standards than other orgs).
The orthogonality thesis in action: the media as highly capable agents, bringing considerable wherewithal to dystopian agendas built on horrifyingly confused conclusions.
(1) "Young people should resist the lure of political activism and stick to time-honored ways of making a difference, like staying in touch with their family and holding church picnics."
Isn't this the same approach as "your single vote doesn't count, stay at home and don't vote" argument which I've seen in the comments on this here site before?
(2) "Obviously this can only be because he’s using his photogenic happy family to “whitewash” his reputation and distract from Facebook’s complicity in spreading misinformation."
I've *seen* the Metaverse trailers, where Zuckerberg is taking a call from his wife (I think she's his wife?) sending him a video about cute stuff their dog is doing. They still fail to convince me that this is anything other than set up as a marketing ploy:
Does this look like something a "photogenic happy family"-man would willingly use as his global image selling-point, or what an android thinks a human would pick?
Odysseus, being smart, tried his best to avoid being drafted for this hopeless war. Achilles' mother dressed him up as a girl and sent him to live with the princess of another court among her maidens to keep him away. Many of them at the time knew getting pulled in was a bad idea, but due to the web of alliances and obligations, they had no choice about turning up to support Menelaus and Agamemnon.
Rather like the First World War, where the assassination of an Austrian arch-duke spiralled into global conflict because everyone was caught in a spider's web of connections.
I'm sympathetic to the concept of EA, and give some of my annual donations to GiveDirectly, who I believe is somewhat EA-aligned. I'm not familiar with the community, so maybe this question is ignorant.
Why was SBF embraced at all by the EA community? It seems most of his giving was political, and not particularly effective. He gave $20 million to a single congressional candidate in Oregon who got smoked in the primary, and a couple hundred million to a Biden IE in 2020, where the marginal impact of a dollar is pretty damn low. Isn't this kind of giving the kind that will personally give the donor a lot of prestige, but is low on the measurable impacts EA embraces? (FWIW I do give money politically, but I see that as separate from pure charitable giving.) Were there people in the EA community before SBF crashed saying, "this guy is not doing EA"?
The $20 million Oregon guy was Carrick Flynn, who formerly worked at the Future of Humanity Institute and was an EA really concerned about x-risk. SBF thought it would be good to have someone like that in Congress helping propose/pass x-risk related laws. I think everyone now agrees this was a bad idea, but I don't think it was insane at the time.
SBF also promised a few billion dollars to the Future Fund, a group led by EA philosophers that was going to try to donate it in the most EA way they could think of. I think they ended up distributing $200 million before FTX collapsed, which was about 50% of EA's funding during that time period.
In terms of him being "embraced by EA" - he used to be a staffer at the Center for Effective Altruism before he left to start his crypto company. So I think he started out part of the community and then became rich and we weren't going to expel him from the community just for being rich. Also, we were briefly getting 50% of our funding from him, and although I don't know if that counts as "embracing" him, I think it would have been dishonest to deny that this made him important and relevant.
I don't think it was an insane idea to support a political candidate, but I think politics, an area where people spend tons and tons of energy analyzing and debating how things should be done, is the exact wrong place to try to bring in EA principles. Coming in saying, "I used math and figured out how to effectively do politics" seems very hubristic and probably wrong.
I personally think EA has a lot more to give in the realm of charitable giving, where there is very little thought given by most donors and very little critical press coverage. It seems in that realm there are huge gains to be made by asking simple questions about effectiveness, and I'm not sure why EA (speaking as a very uninformed outsider) has drifted from providing basic cost-benefit analysis on charitable giving to nebulous realms like electoral politics.
I don't think political donations are really considered an EA cause. SBF was donating a lot personally to politicians but aside from being an EA donor he was running a large crypto business that was heavily lobbying to enact favorable crypto regulations so he could operate in the US. I suspect that had quite a bit to do with his political donations.
As context here, Martin Blank has no particular insight into either EA spending (of which the supermajority still goes to direct global poverty or animal welfare) or the arguments and the manner in which political involvement was discussed (which has been limited to one particular political candidate, and otherwise specifically to the issue of pandemic preparedness). I guess they'd pick a mayor if they thought NIMBYism was an important local cause, but this action is definitely not in his world model.
Do not misinterpret his confidence and volume of posting for actual information about EAs.
(FYI Mike, as an EA I am very much with you on the less effective nature of politics, and most EAs would talk about how systemic political change is a false prophet. However they were seduced by https://www.overcomingbias.com/2007/05/policy_tugowar.html and thought that pandemic preparedness was different enough that the wastefulness of polarization doesn't apply, and that pandemic preparedness was sufficiently neglected and worthwhile that even long shots were good.)
You are taking the example way too literally. The point is that X-risk and AI risk and political campaigns are sexy. Being a clearinghouse for research on charity effectiveness is not.
And I don’t need to be an expert in what EA are doing because they talk about it constantly, so you can just listen to what they actually say! Wild eh?
Yes, and how many EAs have you witnessed talking about global poverty and then switching to AI risk? I would estimate close to zero, because in fact the global poverty subsection of EA is distinct from the X-risk subsection of EA! You're welcome to link to examples or name names, but I suspect you're mistaking the order in which you're hearing things for the direction EA thought has evolved.
The most respected organizations in EA are GiveWell and 80,000 Hours, the former being a clearinghouse for charity research and the latter being career advice, with incidental advice on AI risk if that turns out to be the cause priority someone is interested in. I believe this because most EAs give to AMF and other GiveWell-endorsed charities as the default. MIRI and other X-risk organizations do not have nearly the mindshare that GiveWell does, and this has remained true. The fact that an Oxford philosopher has published a book on longtermism and had a large media campaign about it does not mean that global poverty EAs have vanished in a poof of logic.
The closest thing I can think of to support this PoV is https://forum.effectivealtruism.org/posts/83tEL2sHDTiWR6nwo/ea-survey-2020-cause-prioritization, but the survey finding higher EA engagement correlating to more AI risk engagement is entirely consistent with global poverty causes becoming more popular and the original core of EA containing a sub population concerned with high AI risk. To mention this without caveats or linking to evidence is bad form.
If you aren't an expert in EA and are just listening to what EAs say, I believe it's proper to assume that you've properly vetted which of your statements are supported and which ones aren't, and to distinguish between them. Since you haven't, and haven't provided any basis for your blustering, bloviating, and Bulverism about "what EAs really think", especially when the facts on the ground about popular causes and amount of money moved contradict your narrative, you instead focus on one throwaway sentence not central to my point.
Forget altruism, I find this type of behavior disgusting because it compromises plain old humanist values of honesty, engagement, and thoughtfulness. When confronted with evidence that what you believe isn't the whole picture, you equivocate and claim that what they're doing doesn't end up mattering at all, in a thread about why EA has changed what it is doing.
As always, if you have evidence or have good reasons that you have not shown, I am wrong in my dismissal and my evaluation of your character and I will withdraw my previous statements if you are willing to provide reason or evidence.
Some EA causes such as pandemic prevention have a natural overlap with politics. An NGO can lobby, raise awareness or do research, but the actual work has to be done by the state as it is the one with the required medical, bureaucratic and regulatory infrastructure.
Also "doing politics more effectively than present-day politicians are doing it" doesn't seem like a hard bar to clear. And once you are doing cost-benefit analyses of altruistic causes, it's natural to start to worry about whether you are over-focusing on the kind of causes which are easy to analyze, which doesn't necessarily mean their cost-benefit ratio is better; and there is a lot of money in politics. The total yearly effective altruism budget is somewhere on the scale of $1B. The Build Back Better bill in its initial version would have spent about 3500 billion dollars to a wide array of causes. A few politicians committed to effective altruist causes might be able to move quite significant amounts of money as part of negotiations in an evenly split Senate. Even if you just look at cost-neutral common-sense improvements to some things that aren't really commonsensical currently, there are many opportunities.
If you want to get the government to spend money on pandemic preparedness, that's a niche enough issue you can probably get bipartisan support if you lobby correctly. There are hundreds of lobbies you can learn from, and get it pushed through the "secret Congress" Yglesias writes about. Spending $20 million on a single congressional race, and having it backfire when people question the source of the money, is not effective.
As for politics, there are huge debates about what is effective government. I am on the political left and would probably have huge disagreements with many people in this thread, while we'd likely all agree about malaria nets.
I donate politically and believe my donations are altruistic. But my donations are also my attempt to impose my ideas and values on the country. I personally think my ideas and values are right, but many disagree, hence political divisions. To some extent, politics is a game we play to compete for power and score wins for our tribe. For that reason I think politics is too complicated to apply EA metrics. There are entire academic disciplines - political science and economics - where people try to determine effectiveness in politics. It is not easy to determine.
The only reason I enjoy a tiny bit of schadenfreude in all of this is that some—undoubtedly not all—seem to give off the impression that they think that before they came along a few years ago, nobody else in history ever had the insight that charitable donations are best directed toward where they will do the most good. Rather, everyone else had uncritically donated money to the local opera house with no idea that there could be any issue there. If the same community had done more or less the same substantive work but—instead of making up a cute name for themselves and seeming to think they had invented a novel theory of charitable giving—had just said they were doing some work to help figure out really good recipients and to think through some of the issues related to charity, then they would have much more of my sympathy. It goes without saying that none of this justifies poor media criticism. But as others have noted, there have been plenty of equally poorly reasoned articles supporting “effective altruism” and attacking other views. And it’s worth noting that the very hubris of designating themselves as some grand new movement undoubtedly contributed to the lack of media understanding.
Your use of the word "seeming" is doing an awful lot of work to make your comment not technically incorrect. Nevertheless, you are ascribing positions to people who do not hold them, which is illogical and unjust.
What would you characterize as the novel insight that warrants the invention of a special name like “effective altruism”? I thought it was more or less developing the idea that donations should be directed toward where they will do the maximum good, but I (seriously) stand ready to be corrected if there is something else there, as opposed to just doing good work helping to develop and apply principles that had long been recognized beforehand?
No novel insight is necessary to "warrant" a name like Effective Altruism. Groups can name themselves basically whatever they want without any additional justification.
They can absolutely choose to name themselves. But it does tend to convey the impression that they have some new foundational ideas that separate them from those who fall outside the group or preceded it. At least in my opinion.
You will find yourself perpetually confused if you adopt that mentality. Most groups don't have a new foundational idea, but get named all the same. Effective Altruism succinctly describes the group's goals. That alone makes it a better name than most.
P.S. Ascribing positions to people that do not hold them is still illogical and unjust even if you give the caveat that it is just your opinion.
What I was characterizing as my opinion was a view about what this conveys to reasonable people, not to me in particular, so it’s either right or wrong; either way your postscript is confused. But yes, in this case I do think that EA is ordinarily viewed as offering something conceptually new, and that the name, while not solely responsible for that impression, contributes to it.
I'll note that the following have been presented as clear and obvious reasons for why EA as a movement is psychotic:
1. Caring about someone who isn't your family means not caring about your family at all, making EA a self-defeating / anti-social philosophy.
2. Trying to do things that you don't personally do or see for yourself means it's impossible to evaluate effectiveness.
3. Existing status-quo morality and actions like doing stuff for soup kitchens or helping the homeless are already maximally virtuous, and trying to do anything other than that is a distraction.
4. Doing any sort of monetary donation is mere cover for other nefarious activities.
Note that these arguments are being made in this very comments section, not to mention actual articles about EA. Have you not noticed them? If you have, why do you think those statements are consistent with a world where everyone obviously believes doing good better is what charity is for, as opposed to signaling care, conscientiousness, or loyalty?
I think these objections, most of which are equally applicable to utilitarian systems more generally, have been around a long time before a bunch of people decided to make up the name “effective altruism.” I personally think that some of these critiques have some merit to them, but I wouldn’t overstate them. But respectfully, I don’t view them as really novel to EA.
Is your problem that EA's claims are not novel, and therefore it doesn't get a new name? Would you apply this standard to the Abolitionists? To any number of new religious denominations with comparatively minor theological disagreements? Does the Constitution, since most of its ideas are derived from Enlightenment-era philosophers, not deserve to be considered separate?
Your standard, taken at its word, implies that EA is not even special in terms of how derivative it is, and amounts to an isolated demand for originality.
Abolitionists were a group of people who worked for the abolition of slavery. If a new group of people came up in the middle of the project and announced that they were a cool new movement called “Effective Abolitionism” then I would want to ask what novel ideas separated them from the broader movement. EA (the real EA, not the hypothetical abolitionist analogue) has received a lot of plaudits and cultivated an image of novelty and being on the cutting edge. If the defense is that they aren’t offering novel ideas after all then I would say they should have taken a less hubristic-seeming approach and done good charitable work without presenting themselves as a special vanguard.
If there is an existing network of people who use broadly EA principles (using gold-standard RCTs, trying to optimize charitable giving in some fashion, picking careers to do the most good), then I have not heard of them. What named movement are you referring to, and why haven't I heard of them despite wanting to find people who do not reflexively demand you toe the morality line?
We keep going back and forth between “no there’s nothing novel here and there needn’t be” and “yes they are novel; name me someone else doing this!” I was responding to the former argument. As to the second, there has been a long history of altruism with efforts to judge and assess the best recipients. This isn’t a new idea. It’s fantastic for a bunch of smart people to work on these issues and to place an emphasis on generosity. But I don’t see anything conceptually novel here. I’m open to argument, though, if you want to give me one. But then if the answer turns out to be “no there isn’t any conceptual novelty here but you don’t need that,” then I revert back to the argument in my prior comment.
I don’t think the human urge to pile on when someone screws up in a spectacular way is a very new thing. Nor the urge to find someone to blame when things go south.
The old saying, no good deed goes unpunished, comes to my mind.
It is interesting to me how so many people seem to have reached the conclusion that SBF was a con man through and through when the facts on the ground don’t even come close to supporting that. That might change, but right now the only clear thing is that he misdirected a whole bunch of money to cover up a gaping hole in the balance sheet somewhere else, and (I imagine) assumed that crypto would start going up again and he would be able to make everyone whole. How much damage has been done to a lot of people by the crash in cryptocurrencies that has nothing to do with FTX per se? In my opinion, comparing him to Bernie Madoff at this juncture in the investigation is really over the top. Bernie Madoff ran a Ponzi scheme for years with no intention of making anyone whole. What endgame was in his mind I will never know. Nothing has come out of this yet that puts SBF anywhere near that league.
I don't think SBF is being portrayed unfairly. Yes it's a different scam than Madoff, but the amount of scam is still superlative. Maybe SBF legitimately thought he could make people whole in the end, but there was so much book-cooking involved here that there's no innocent explanation. We don't know all the details--and likely won't for years as this gets disentangled--but we know enough to know it's bad.
Historically, almost everyone who pulled off something like this had some plausible theory in their mind of how they would “eventually make everyone whole”. Charles Ponzi included.
I am not sure that the original Charles Ponzi cared much about that, but I am no expert. And I am not defending him so much as applying my own sensibilities to the information I have available. So far it is more Ken Kesey and the Merry Pranksters than it is the James gang.
The original Charles Ponzi for sure thought he was going to dig his way out. The guy was a lifetime grifter and loser, but he didn’t have any real business arbitrage idea to build it on.
You could well be right. I looked up Ponzi and it didn’t settle anything for me. Did he think everyone would come out alright in the end, or did he know he was screwing them? Or does it matter?
In some corners of the media, it is being portrayed rather matter-of-factly. In other places, not so much. We will see how it unfolds, but I was just stating my opinion about what I perceive as a rush to judgment in some circles, and more than a little of, “I’m shocked to find gambling going on here, Rick.”
Just get the NYT to cover you or something you care about, in a politically charged context where their politics incentivizes them to be misleading. The feedback loop shorts out the chip, causing first pain, and then a feeling of emptiness and loss, followed by a sense of isolation as you will then be cut off from the collective. After that, all you need to do is meditate for half an hour and you'll be able to pinpoint the location precisely.
Maybe not a conman, but if you read the bankruptcy filing, you wouldn't run a school charity sale the way he ran things (and I know; I've had to prepare paperwork for our auditors, including details of the petty cash. If we have to account for a couple of thousand over a year, what the heck were the FTX auditors doing with regard to billions? If there even *were* auditors, which there may not have been, which is another problem):
"I have over 40 years of legal and restructuring experience. I have been the Chief Restructuring Officer or Chief Executive Officer in several of the largest corporate failures in history. I have supervised situations involving allegations of criminal activity and malfeasance (Enron). I have supervised situations involving novel financial structures (Enron and Residential Capital) and cross-border asset recovery and maximization (Nortel and Overseas Shipholding). Nearly every situation in which I have been involved has been characterized by defects of some sort in internal controls, regulatory compliance, human resources and systems integrity.
Never in my career have I seen such a complete failure of corporate controls and such a complete absence of trustworthy financial information as occurred here. From compromised systems integrity and faulty regulatory oversight abroad, to the concentration of control in the hands of a very small group of inexperienced, unsophisticated and potentially compromised individuals, this situation is unprecedented.
...I have been provided with an unaudited consolidated balance sheet for the WRS Silo as of September 30, 2022, which is the latest balance sheet available. The balance sheet shows $1.36 billion in total assets as of that date. However, because this balance sheet was produced while the Debtors were controlled by Mr. Bankman-Fried, I do not have confidence in it, and the information therein may not be correct as of the date stated."
That last is a running refrain all through the filing: "I do not have confidence in [statement provided] and the information therein may not be correct as of the date stated." I would recommend reading the filing; it breaks down the structure of the entire house of cards and is entertaining on top of that, not something you can ordinarily say about financial documents, due to Mr. Ray the Third being hopping mad at the mess he has been landed with to clean up.
Yep, looks like auditors never went next, nigh or near this set-up:
"Alameda Research LLC prepared consolidated financial statements on a quarterly basis. To my knowledge, none of these financial statements have been audited. The September 30, 2022 balance sheet for the Alameda Silo shows $13.46 billion in total assets as of its date. However, because this balance sheet was unaudited and produced while the Debtors were controlled by Mr. Bankman-Fried, I do not have confidence in it and the information therein may not be correct as of the date stated.
...To my knowledge, Debtors Clifton Bay Investments, LLC and FTX Ventures Ltd. prepared financial statements on a quarterly basis. The September 30, 2022 balance sheet for Debtor Clifton Bay Investments LLC shows assets with a total value of $1.52 billion as of its date, and the September 30, 2022 balance sheet for FTX Ventures Ltd. shows assets with a total value of $493 million as of its date. To my knowledge, none of these financial statements have been audited. Because these balance sheets were unaudited and produced while the Debtors were controlled by Mr. Bankman-Fried, I do not have confidence in them, and the information therein may not be correct as of the date stated.
I have not been able to locate financial statements for Island Bay Ventures Inc."
Has Mr. Ray mentioned that he does not have confidence in the information? Because he doesn't, you know 😁:
"The FTX.com platform grew quickly since its launch to become one of the largest cryptocurrency exchanges in the world. Mr. Bankman-Fried claimed that, by the end of 2021, around $15 billion of assets were on the platform, which according to him handled approximately 10% of global volume for crypto trading at the time. Mr. Bankman-Fried also claimed that FTX.com, as of July 2022, had “millions” of registered users. These figures have not been verified by my team.
The Dotcom Silo’s unaudited consolidated balance sheet as of September 30, 2022 is the latest balance sheet that was provided to me with respect to the Dotcom Silo. It shows total assets of $2.25 billion as of September 30, 2022. Because such balance sheet was produced while the Debtors were controlled by Mr. Bankman-Fried, I do not have confidence in it, and the information therein may not be correct as of the date stated."
The lawyers are going to make *fortunes* out of disentangling this. One (headquarters) for everyone in the audience!:
"As mentioned above, Alameda Research LLC is organized in the State of Delaware. The other Debtors in the Alameda Silo are organized in Delaware, Korea, Japan, the British Virgin Islands, Antigua, Hong Kong, Singapore, the Seychelles, the Cayman Islands, the Bahamas, Australia, Panama, Turkey and Nigeria.
...On November 10, 2022, the Securities Commission of the Bahamas (the “SCB”) took action to freeze assets of non-Debtor FTX Digital Markets Ltd., a service provider to FTX Trading Ltd. and the employer of certain current and former executives and staff in the Bahamas. Mr. Brian Simms, K.C. was appointed as provisional liquidator of FTX Digital Markets Ltd. on a sealed record. The provisional liquidator for this Bahamas subsidiary has filed a chapter 15 petition seeking recognition of the provisional liquidation proceeding in the Bankruptcy Court for the Southern District of New York.
In addition, in the first hours of November 11, 2022 EST, the directors of non-Debtors FTX Express Pty Ltd and FTX Australia Pty Ltd., both Australian entities, appointed Messrs. Scott Langdon, John Mouawad and Rahul Goyal of KordaMentha Restructuring as voluntary administrators."
We're a registered charity, we are bound by a load of regulations, including the necessity for regular board meetings and records of same. If only we were a fancy-schmancy crypto outfit, we'd be able to get away with murder.
Governance and cash management were terrible to non-existent, and Mr. Ray is making sure Bankman-Fried and/or others can't access the bank accounts:
"Many of the companies in the FTX Group, especially those organized in Antigua and the Bahamas, did not have appropriate corporate governance. I understand that many entities, for example, never had board meetings.
The appointment of the [new, independent] Directors will provide the FTX Group with appropriate corporate governance for the first time.
...The Debtors have been in contact with banking institutions that they believe hold or may hold Debtor cash. These banking institutions have been instructed to freeze withdrawals and alerted not to accept instructions from Mr. Bankman-Fried or other signatories. Proper signature authority and reporting systems are expected to be arranged shortly.
Effective cash management also requires liquidity forecasting, which I understand was also generally absent from the FTX Group historically. The Debtors are putting in place the systems and processes necessary for Alvarez & Marsal to produce a reliable cash forecast as well as the cash reporting required for Monthly Operating Reports under the Bankruptcy Code."
Turns out *some* of the web of companies *did* have auditors - that were based out of the Metaverse 🤣
"The FTX Group received audit opinions on consolidated financial statements for two of the Silos – the WRS Silo and the Dotcom Silo – for the period ended December 31, 2021. The audit firm for the WRS Silo, Armanino LLP, was a firm with which I am professionally familiar. The audit firm for the Dotcom Silo was Prager Metis, a firm with which I am not familiar and whose website indicates that they are the “first-ever CPA firm to officially open its Metaverse headquarters in the metaverse platform Decentraland.
I have substantial concerns as to the information presented in these audited financial statements, especially with respect to the Dotcom Silo. As a practical matter, I do not believe it appropriate for stakeholders or the Court to rely on the audited financial statements as a reliable indication of the financial circumstances of these Silos.
The Debtors have not yet been able to locate any audited financial statements with respect to the Alameda Silo or the Ventures Silo.
The Debtors are locating and securing all available financial records but expect it will be some time before reliable historical financial statements can be prepared for the FTX Group with which I am comfortable as Chief Executive Officer. The Debtors do not have an accounting department and outsource this function."
"I have substantial concerns" - you don't say, John J.?
Did you work for FTX at any point in time? You never know, you might have done! They don't know themselves who exactly did or didn't work for them!
"The FTX Group’s approach to human resources combined employees of various entities and outside contractors, with unclear records and lines of responsibility. At this time, the Debtors have been unable to prepare a complete list of who worked for the FTX Group as of the Petition Date, or the terms of their employment. Repeated attempts to locate certain presumed employees to confirm their status have been unsuccessful to date."
Though credit where it's due, Mr. Ray does appreciate those who were doing their jobs as best they could:
"Nevertheless, there is a core team of dedicated employees at the FTX Group who have stayed focused on their jobs during this crisis and with whom I have established appropriate lines of authority and working relationships. The Debtors continue to review personnel issues but I expect, based on my experience and the nature of the Debtors’ business, that a large number of employees of the Debtors will need to continue to work for the Debtors for the foreseeable future in order to establish accountability, preserve value and maximize stakeholder recoveries after the departure of Mr. Bankman-Fried. As Chief Executive Officer, I am thankful for the extraordinary efforts of this group of employees, who despite difficult personal circumstances, have risen to the occasion and demonstrated their critical importance to the Debtors."
Getting back to "you did *what*???" territory:
"The Debtors did not have the type of disbursement controls that I believe are appropriate for a business enterprise. For example, employees of the FTX Group submitted payment requests through an on-line ‘chat’ platform where a disparate group of supervisors approved disbursements by responding with personalized emojis.
In the Bahamas, I understand that corporate funds of the FTX Group were used to purchase homes and other personal items for employees and advisors. I understand that there does not appear to be documentation for certain of these transactions as loans, and that certain real estate was recorded in the personal name of these employees and advisors on the records of the Bahamas."
The really serious allegations when it comes to who was handling the money and where it went (or didn't go):
"The FTX Group did not keep appropriate books and records, or security controls, with respect to its digital assets. Mr. Bankman-Fried and Mr. Wang controlled access to digital assets of the main businesses in the FTX Group (with the exception of LedgerX, regulated by the CFTC, and certain other regulated and/or licensed subsidiaries). Unacceptable management practices included the use of an unsecured group email account as the root user to access confidential private keys and critically sensitive data for the FTX Group companies around the world, the absence of daily reconciliation of positions on the blockchain, the use of software to conceal the misuse of customer funds, the secret exemption of Alameda from certain aspects of FTX.com’s auto-liquidation protocol, and the absence of independent governance as between Alameda (owned 90% by Mr. Bankman-Fried and 10% by Mr. Wang) and the Dotcom Silo (in which third parties had invested)."
Go read the whole thing, it'll explain why this is such a tangled mess.
Even if Bankman-Fried wasn't a fraudster, or didn't start out as a fraudster, the way he (and it really is majorly down to him) ran the concern was terribly careless, so it's not surprising they lost money because nobody really seems to have known what the hell was going on or who was doing what.
The Sequoia article is also a great look into what was going on. The writer was won over by Bankman-Fried's charisma (or whatever field of conviction he had going on), so he ends up giddy with praise, but the warning signs are there, in hindsight:
"The HQ building is distinguished by a reception desk in the microscopic lobby. The door is unlocked. There is no receptionist. I peek around the corner and into the FTX command center—29 desks in a room designed to hold 8, at most. Every desk touches two or three others. There are no aisles. To get across the room, you have to wade through (and, at times, climb over) a sea of office chairs. Walls of wide-screen monitors—two, four, even six per desk—stand in place of cubicle walls. The screens erupt like palm leaves from aluminum uprights and are oriented willy-nilly: up, down, sideways. Some screens are mounted so high they seem to hang down from the ceiling. It’s office environment as jungle, and the oddest thing about it is that no one seems to be home.
...At first blush, the scene is classic startup—the kitchen full of snacks and soda; the free catered breakfasts, lunches, and dinners; the company bathroom stocked with everything you’d need to actually live at the office: Q-tips, disposable razors, Kotex. In keeping with the fashion aesthetic of senior management, the dress code is marketing-swag-meets-utilitarian-merch: gift-bag T-shirts featuring the FTX logo, nylon athletic shorts, white-cotton gym socks.
But, as the week wears on, it’s the differences that start to stand out. FTX is not your ordinary startup. Most noticeable is the average age of the employees. Among senior management, SBF himself has just turned 30; Singh is 28; Arora is the old man of the group at 35. The company is also remarkably international. You hear the rapid-fire rhythm of Mandarin as often as English, but even that lingua franca comes in a wide variety of flavors—everything from a Bahamian lilt to the broken argot of ESL."
And here's a guy who is probably regretting his life-choices right now:
"Can Sun, FTX’s in-house legal counsel, tells me that his main job is to cement the many deals SBF makes on a handshake. Ninety-nine times out of a hundred, Sun says, the terms favor the other side. It’s another corporate policy derived from a rigorous logical argument: In an iterated prisoner’s dilemma, the best first move is always to cooperate. And, if the counterparty defects, “it’s better that I know this guy will screw me over now,” Sun says, “rather than later.”
I think the part in the following about the "collegiate feel" is important; whatever about being young people in a young company, they are all around thirty years of age, late twenties to early thirties, but they're still living and working in a setup like they're nineteen and in college. That's not how you handle a business dealing with hundreds of millions and ambitions to do even more. It's a fundamental lack of responsibility; yes, they may be enjoying what they are doing, making their passion their work, but there comes a time when you have to grow up and do the boring routine adult stuff, like keeping a set of accounts and not climbing over chairs in the main office to get to your desk:
"I clock in at FTX HQ at nine and clock out at five for most of the week—until, one day, I’m invited to live in what amounts to the FTX dorms. Many employees take advantage of subsidized corporate housing at a nearby development called Albany. The heart of the development is a yacht basin and marina surrounded by half a dozen residential towers. The area is so new that several towers are still under construction. FTX owns a passel of the multi-bedroom apartments in the towers and rents them out as crash pads to employees. There’s a collegiate feel to the whole setup. Indeed, Albany could be mistaken for an institution of higher learning: Behind the gatehouse is everything you could ever ask for on a campus—restaurants, cafés, a health club, golf and tennis facilities and, of course, classrooms."
The scale of ambition was definitely grandiose, maybe even delusional:
"To be clear, SBF is not talking about maximizing the total value of FTX—he’s talking about maximizing the total value of the universe. And his units are not dollars: In a kind of GDP for the universe, his units are the units of a utilitarian. He’s maximizing utils, units of happiness. And not just for every living soul, but also every soul—human and animal—that will ever live in the future. Maximizing the total happiness of the future—that’s SBF’s ultimate goal. FTX is just a means to that end.
...“So,” I summarize, “you are young and vital and peaking at precisely the point where the world is at, as you see it, peak crisis.” SBF nods in agreement, deep in another round of Storybook Brawl. “Does that strike you as just a lucky coincidence, or does that strike you as perhaps a signal that your thinking is flawed and you have a savior complex?”
“It’s an interesting question,” he says, stalling.
I double down: “You just happen to be alive in the most important time in the history of the future race. The existential point! Really?”
SBF hedges: “It certainly would not be one’s prior—at least, not naively.”
“Prior”—that’s a term of art. There’s more math to explain (in this case, Bayes’ theorem), but in the interest of you, dear reader, I will skip it.
“But,” SBF continues, “if you want to really needle on that, there are some anthropic considerations by which that might not be as crazy as it sounds.” With the mention of “anthropic” we’ve reached conversational escape velocity and head into the nosebleed regions of modern metaphysics. Again, I’ll spare you the trouble. Suffice it to say that, while SBF is willing to consider the idea that he might be delusional, as a kind of thought experiment, he ultimately dismisses it.
Game over.
After my interview with SBF, I was convinced: I was talking to a future trillionaire. Whatever mojo he worked on the partners at Sequoia—who fell for him after one Zoom—had worked on me, too. For me, it was simply a gut feeling. I’ve been talking to founders and doing deep dives into technology companies for decades. It’s been my entire professional life as a writer. And because of that experience, there must be a pattern-matching algorithm churning away somewhere in my subconscious. I don’t know how I know, I just do. SBF is a winner.
But that wasn’t even the main thing. There was something else I felt: something in my heart, not just my gut. After sitting ten feet from him for most of the week, studying him in the human musk of the startup grind and chatting in between beanbag naps, I couldn’t shake the feeling that this guy is actually as selfless as he claims to be.
So I find myself convinced that, if SBF can keep his wits about him in the years ahead, he’s going to slay—that, just as Alameda was a stepping stone to FTX, FTX will be to the super-app. Banking will be disrupted and transformed by crypto, just as media was transformed and disrupted by the web. Something of the sort must happen eventually, as the current system, with its layers upon layers of intermediaries, is antiquated and prone to crashing—the global financial crisis of 2008 was just the latest in a long line of failures that occurred because banks didn’t actually know what was on their balance sheets. Crypto is money that can audit itself, no accountant or bookkeeper needed, and thus a financial system with the blockchain built in can, in theory, cut out most of the financial middlemen, to the advantage of all. Of course, that’s the pitch of every crypto company out there. The FTX competitive advantage? Ethical behavior. SBF is a Peter Singer–inspired utilitarian in a sea of Robert Nozick–inspired libertarians. He’s an ethical maximalist in an industry that’s overwhelmingly populated with ethical minimalists. I’m a Nozick man myself, but I know who I’d rather trust my money with: SBF, hands-down. And if he does end up saving the world as a side effect of being my banker, all the better."
> the way he (and it really is majorly down to him) ran the concern was terribly careless
You’ll get no argument from me on that one. This is not your grandfather’s corporate structure, to channel Ringo Starr. At the end of the day I am sure there are any number of laws they broke, but I don’t know much about the laws in the Bahamas regarding corporate governance.
The big question is fraud, and it’s there that I hold my fire. If one’s opinion is that cryptocurrency itself is a fraud (an opinion I am open to), that doesn’t really point a finger at him on that score. Fraud requires intent. I don’t think his intent is very clear at the moment. Ponzi dealt with an asset that was universally considered sound: cash.
The hallmark of the eponymous scheme is that it can never make everyone whole; it must keep swimming or die, and when it dies someone is left holding the bag. Crypto as the underlying asset confounds this a bit.
I should read up on tulips; maybe that would shed some light.
I usually get about ten new subscriptions per subscriber-only post. If the average subscriber pays me $10/month and keeps subscribing for six months, that makes me $600. So I guess before writing this I would have wanted $600 to make it public, and I ought to stick to that even after I've written it to avoid feeling like I'm doing a bait-and-switch where subscribers regret subscribing to me because I just make posts public anyway.
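As a sanity check, the back-of-the-envelope arithmetic above does work out; a minimal sketch, using only the figures stated in the comment (the variable names are just illustrative):

```python
# Rough expected revenue from one subscriber-only post,
# using the figures given above (assumptions, not real accounting).
new_subs_per_post = 10  # new subscriptions a subscriber-only post typically brings in
price_per_month = 10    # average dollars each subscriber pays per month
months_retained = 6     # average months a subscriber keeps paying

revenue = new_subs_per_post * price_per_month * months_retained
print(revenue)  # 600
```

So the $600 figure is simply subscribers times price times retention, which is also why making the post public is framed as forgoing that amount.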
I said this before, but think about $5/€5 a month. That’s a cost I consider free; I don’t think about it. I kept Apple TV going at €5 even in months when I never watched an episode, just because I didn’t care about €5. Now that it’s €6.99 I am not going to keep paying every month, just when there is something to watch. This is not very logical, I know, but I discount €5 a month while multiplying everything else by 12 and thinking yearly.
Your free content is excellent and you want to reach the largest audience. That is noble. The extra content gained from paying is therefore not that much. I’d be happy to pay you at a lower price for no extra content.
Another option might be to allow paid subscribers one day's access before the free articles are released.
For this to work, I would need twice as many people as are currently subscribed to be willing to subscribe at the lower price.
Although I know there are some people who would subscribe regardless of whether or not there's free content, I lose about 10% of subscribers per year naturally and I'm trying to figure out ways to convince more people to subscribe.
I subscribed because I saw something referenced somewhere (off-site) saying that your subscriber-only posts included the occasional AMA.
I think that your please-subscribe pitch, such as it is, is not either particularly convincing or particularly descriptive. If you're serious about wanting more subscribers, I'd seriously consider hiring a marketing person of some sort, just as a contract job to help you write those one or two lines.
I subscribed for the measured epistemology of your writing, but this post just seems mean. I'm certainly in your tribe but, if I'm going to be an honest data point, much more of this persecuted-by-the-media thing and I'm much less likely to re-up.
Whether it sounds persecuted or not isn't the main point. Maybe half of all things sound justified to half of all humanity.
The main point is how much value I've got-and-paid-for and evangelized from the regular epistemology of your writing. What I want is the high standard you've set up: empathetic, analytic, and urgent.
"Is there a way to say "I'm sad that people are using stupid arguments to try to discredit my friends" without sounding "persecuted"?"
Not really. I won't say it's the only way to handle it, but Hypothetical You are going to get criticised anyway for anything that is perceived as being sympathetic to FTX/Bankman-Fried (see all the commentary about the media going easy on him and 'this is probably because he's a Democrat/he gave tons of money to Democrat politicians/he's one of the rich in their circles').
So go full-on "Some of my friends are caught up in this and I'm sick and tired of them getting called names for shit they didn't do so I'm going to defend them and if you don't like that, tough! Door's over there!"
(Like I said: Catholic. Sex abuse scandal. Been there, got the jokes, oh yeah and the "the reason the Catholic Church is anti-abortion is so that there will be a plentiful supply of child victims for priests to rape" memes, whatever you say isn't going to appease everyone, so feck the begrudgers).
Eh, it's like on Reddit. If someone attacks your post and you zip your lip or give a mild response, the group stays open to your ideas and after a while you get some interested or supportive responses. But if you complain the sub downvotes you to hell.
I missed the point of every single parodied article in Scott's post -- I guess because I avoid almost all the mainstream media. I do not think I have read a single article about EA in the last year, so I have not suffered through any instances of some columnist misrepresenting and mistreating EA. So while I have nothing against this post of Scott's being made more widely available, I actually think it's less accessible than a lot of his stuff, because you have to have a sense of how the media treats EA to appreciate it. And did I see somebody earlier suggesting he edit it to make it appropriate for younger readers? I'm not sure I did. But if somebody did -- I think very few teens are going to be up on what the media is saying about EA, and thence able to appreciate this post.
Nolan E.’s suggestion of allowing paid subscribers early access is, I think, a good one. “It’s all free eventually, but if you help subsidize by subscribing you get looped in quicker” sounds like something people would pay for. I won my current subscription as a prize in the book review contest, but will consider re-upping as needed. Just getting everything hot off the press would have some value, as opposed to free access the next day.
I think the way I would do this, if I were going to do it, would be to have a "subscription drive" post every January, where I say something like "Here are some of the members-only posts you missed out on last year - [link to some previously members-only posts that I'm opening up for everyone]. Here are some that you still can't read without a subscription [links to those]. Get a subscription today and you'll be able to read posts like this as they come out!"
Do commenters write things in locked posts that they wouldn't want exposed to Internet search engines? You've given us no guarantee of privacy, of course, and "put not your trust in princes" yadda yadda.
I know I've tried not to write stuff in locked posts that would be Bad if spread around the Internet, but I have also noticed myself being a little looser-tongued than in open posts. (Part, I suppose, is just that I assume commenters in here will be more charitable and less nuts than out there.)
It would incentivize you to make the members-only posts less topical than the regular ones, so they would remain interesting next January. Which would incentivize me, for one, to continue subscribing.
"Abraham Lincoln, Necromancer" was one of the first things I read by you - it had me literally laughing out loud, and thinking "damn, this guy's smart and also hilarious." I've been reading your stuff ever since and - although I take at least a passing interest in all of it - the more timeless bits are my favorite. By "timeless" I mean either "irreverent take on bizarre historical episodes" or "quirky, funny fiction with some serious philosophical implications" or "galaxy-brained metaphors and speculations about Big Ideas." All of which you do exceptionally well.
On the other hand, if that caused the free stuff to shift towards a primary focus on current events, prediction markets, AI safety, Bay Area rationalism, and psychiatric medications, it might reduce the blog's ability to appeal to an intellectually diverse crowd. Which might, in turn, reduce the number of potential subscribers.
So yeah - I don't really have a recommendation, but those are my thoughts, and I hope you succeed at figuring out the best approach.
I would have expected you to price it on the basis of how much headache it would create for you if this were to be made public. Just doing the comment moderation on a public post would suck, I expect.
The last one is right though. The fighting racism battle is way past the point of diminishing returns and people who worry about it would be better off working on issues internal to their communities.
How does this compare with the way the media shredded the Sackler family? I acknowledge there are differences -- super rich people lie about the addiction potential of their pharmaceuticals and give mountains of money to museums and opera houses and totally non-EA charities. Not only did the media go after them, The Law went after them with some success.
Some of this is just plain old resentment against rich people. Like Richard Cory, they glitter when they walk. They get worshipful press coverage for their good deeds. By contrast we're nothing, and we don't like being nothing. So when they fall, they get torn to shreds by the herd, we sickly and mediocre puddles of hatred. The rich, their charity and beneficence don't shine so brightly now! Turns out we needn't feel bad about ourselves!
- An idealistic group of mostly young white men has formed the Union movement. They claim that they want to improve working conditions for workers all over the world, but their methods have been controversial. Recent events have shown the catastrophic but predictable outcomes of the strategy called "strike-till-you-get-what-you-want" in Union circles. Famous philosophers have idealistically claimed that this method should theoretically be an effective way to improve working conditions and wages. But anyone with a common-sense understanding of real-world human psychology can predict that strikes will attract lazy people who do not want to work and who can now hide behind this fashionable cause. Worse, strikes may cause productive workers to become lazy due to the prolonged inaction. For every worker who strikes for improved conditions, there will be ten who strike out of pure laziness. Instead of striking, maybe workers would be better off cultivating a stoic mindset?
- Another issue for the Union movement is that some in the movement seem less concerned with simply improving working conditions, and more concerned with abstract ideas about perfect utopian societies. Inspired by philosophers, they predict that "socialism" will occur in the future, and that the Unions should prepare for it instead of focusing on improving working conditions now. While this may be a useful idea to think about, it is clear that these fringe theories are scaring off ordinary people who just care about wages and working conditions and who might otherwise want to join the Union movement. The Union movement would better achieve its goals if the person in charge would ban discussions of "socialism". Those discussing socialism could just form a new movement, and the person in charge could make sure that these two movements never overlap. That this hasn't happened already indicates that the Union movement is incompetent, and that it's more interested in theoretical speculation than real-world working conditions.
These echoes-of-Slate-Star-Scratchpad type subscriber posts are a nice cherry on top of the 5% paid content tithe, but for meta-consideration reasons I'm glad you're choosing to keep them paywalled. There's a time and place for half-serious half-trolling optically-spicy punching takes, and...God help me, I think Twitter works better for that than The Blog. Different product lines and all that, even if both are run by the same boss.
(It wouldn't be too much more work to polish this into a suitable-for-all-ages post, though. I think.)
How many indulgences do they pay you to write these comments?
/s
I mean... 3 of these make a good point!
(Love this series of posts so much)
Remember this instead of having Gel-Mann amnesia next time you read something.
https://astralcodexten.substack.com/p/bounded-distrust
Avoiding Gel-Mann amnesia tells me the media is often wrong. But the media isn't infinitely wrong (for example, I do believe there is a person named "Vladimir Putin", and he may even be Russian). So how do I know how much to trust vs. distrust the media. Just noticing that Gel-Mann amnesia is real and important doesn't answer that for me.
You can assume they're about as wrong and in the same manner as they are wrong about whatever things you're familiar with.
Indeed, and it is a sobering thought…
Or in other words, they *do* report about everything this way.
This. People acknowledge Gel-Mann amnesia but never internalize what it means.
I'm not sure how true that is for people who are familiar with unusual things that reporters would have much less knowledge of than the average thing they write about.
Seemingly ... In this case they are less wrong about the facts, and more wrong about the interpretation. This is less bad, because one can come up with one's own interpretation. We should update our assessment of the media accordingly.
Poor interpretation will lead you to present a pretty limited or irrelevant set of facts.
Good point. Also: I miss Michael Crichton.
Why not just ignore the media?
I mean, were you unaware that Vladimir Putin existed? Are there other sources you could get this information from? What does the current media give you that isn't provided better from another source?
For example, do you think you'd be better informed about Russia by reading the news or by reading, say, State Department press releases? What does the media do better than government press releases, other than making it more entertaining and more hateful?
I mean, from a sufficiently abstract point of view, all of those "sources" are the media. They're all intermediates between what happens and you. If you want to learn about things outside your immediate experience, you'll have to trust at least *some* media. For certain things it makes sense that the media you trust are the big professional organizations.
Also, to answer your specific question, what the big professional media do better than the government is not being the government, and therefore providing an alternate viewpoint that can run contrary to whatever hidden motives the government may have in showing you that press release.
I'm...not sure I catch this vibe and it feels very...abstract. Very concretely:
You can get your information from social media, like Twitter, which is occasionally extremely good and 90+% of the time utterly horrific.
You can get your information from news media, like the NYT, which has a long and detailed history of yellow journalism and outright deception.
You can get your information from government sources, like the FRED blog, which has hidden government motives.
Or, more realistically, you're consuming some mix of all of the above, plus a host of other sources. Why should the news media, a group of organizations with a really long history of bad behavior and lying be in that mix? If you're scanning a few good Twitter accounts, subscribed to some substacks, have some podcasts, you're already consuming a ton of media. Within that environment, why shouldn't you just stop paying attention to the NYT on, say, abortion and just go listen to the Dobbs oral arguments (1)?
(1) https://www.supremecourt.gov/oral_arguments/audio/2021/19-1392
I don't really have thoughts about the NYT, which I don't read and never regularly have. But professional news media does tend to have certain properties that are interesting, such as access to certain places that a random substacker probably wouldn't get to in high-level politics or business, etc. They also tend to have a lot of resources, which also gives them capabilities that smaller sources don't have. And they're professionals at synthesizing information in a digestible way, which is often more practical than reading the original sources.
There's also the sense in which, yes, news media have perverse incentives to maximize their readership, reputation, and revenue to the detriment of truth, but paradoxically you can trust them somewhat *more* knowing this. Compare a state-owned newspaper with a private for-profit one. Both distort information, but the for-profit one (maybe!) doesn't have malicious hidden motives, they just want to make more money.
All that said, I totally agree that a lot of the mainstream media isn't really trustable, and if you have reasons of your own not to trust an entire org such as the NYT, banning them from your media mix is perfectly reasonable.
>I mean, were you unaware that Vladimir Putin existed?
I certainly would be if I ignored all media! I've never met him in person, nor have any of my close acquaintances, so where else would I have heard the name?
Reference books?
I think the best solution I have found is to locate a conversation, typically online, where intelligent people with a wide range of political (and perhaps religious and scientific) beliefs argue with each other. You read a conservative who you have found to be intelligent writing about some issue and you probably have the best arguments for the conservative view of that issue. You read a leftist similarly. You then evaluate the arguments for yourself.
Gell-Mann*
I know. This article pissed me off sooo much! ( https://newrepublic.com/article/168885/bankman-fried-effective-altruism-bunk )
Like it couldn't decide what its argument was other than to diss EA. I mean, if there's this bad rich guy then surely the more of his cash you divert to helping ppl the better, right? And it acted like flattering rich people's egos was somehow a particular problem for EA, as if fancy thousand-dollar-a-plate dinners, offering large donors leadership positions, and otherwise fundraising via flattery weren't a thing for every last charity.
I have some theories about why this happens but I'll put that in a separate comment.
Thing is, it wasn't *his* cash but "all the people who invested with my fraudulent scheme" cash. That's what is getting him in trouble.
I have no particular animus against EA but having a prominent swindler (let's just slap "alleged" all over this comment so that lawyers don't come after Scott, it's not him it's me your honour) rook money out of people in part by playing the charity card is going to look bad to people outside.
Take this from the Wall Street Journal, representative of the media coverage:
https://www.wsj.com/articles/sam-bankman-frieds-plans-to-save-the-world-went-down-in-flames-11669257574?reflink=e2twmkts
"Run by self-described idealists spending the wealth of their billionaire patron to make the world a better place, Mr. Bankman-Fried’s FTX Foundation and its flagship Future Fund touted deep pockets, ambitious goals and fast turnarounds.
Now Mr. Bankman-Fried’s fortune has disappeared, and the self-described philosopher-executives running the organizations have resigned. Grant recipients are scrambling for cash to plug the shortfall and fretting about the provenance of FTX’s largess after the company’s lawyers said this week that a “substantial amount” of assets were missing and possibly stolen.
Mr. Bankman-Fried often claimed philanthropy was his primary motivation for amassing a fortune. “It’s the thing that matters the most in the end,” he said in an April interview on the “80,000 Hours” podcast.
Will MacAskill, then a philosophy graduate student, pitched Mr. Bankman-Fried on the idea of effective altruism, a way of applying some utilitarian ideas to charitable giving.
“Effective altruists say you should use reason to compare causes and find the thing that can get you the highest return,” said Stanford University political science professor Rob Reich. “Giving to [elite universities] or the art museum is a much lower return on your charitable donation than giving to an antimalaria nonprofit that looks to prevent easily preventable deaths.”
Mr. Bankman-Fried had considered different career paths, he said in the “80,000 Hours” interview, but Mr. MacAskill suggested he could do the most good by making a lot of money and giving it away, a popular idea in the community.
Mr. Bankman-Fried set his sights on crypto, founding trading firm Alameda Research in 2017. He launched FTX a few years later. The price of bitcoin and other digital currencies surged, helping FTX become one of the world’s five biggest crypto exchanges. He soon helped establish the FTX Foundation, Future Fund and a family foundation, Building A Stronger Future. "
This is rather sympathetic coverage, but somebody could come away from that wondering how the heck this altruistic philosophy apparently left out the bits about "don't steal, don't cheat, don't swindle and don't lie". At best, he wanted to get rich quick to do good and took a lot of ill-advised shortcuts to do that. At worst, he used the high-falutin' notions as a cover for stealing.
Now EA has to deal with the fallout of "okay, one rotten apple". I mean, I'm Catholic, welcome to the way media coverage works when there is a huge scandal associated with you and yours, especially when morals/ethics/principles are involved.
Probably unfair of me because like I said, I don't have any particular animus about EA, but I'm enjoying all this coverage because, well, that was a juicy one, exile! It's great fun to read a big story like this when it's not your ox being gored.
Well Scott's safe under Section 230 of the Communications Decency Act. To address your main argument, I certainly agree that EA needs a way broader donor base and to get less of its money from one or two billionaires who might turn out to be unsavory (tho it still seems way better that they give even stolen money to charity than spending it on themselves).
But that's what's so frustrating about the coverage. If the argument was that EA is too reliant on a few big donors well fair criticism (tho not clear how actionable it is). But the arguments always seem to slide from: EA is too associated with a few big sketchy donors to an implication that the cause itself is somehow suspect.
And it's always been the way that sketchy rich ppl launder their public images by charitable donations and whenever it happens with other charities the attitude taken seems much more like mine: well it's good that money went to great causes.
Sure, if there were a bunch of givewell ads or propaganda featuring Sam then that criticism would be fair, but it seems like what happened is that the journalists latched on to Sam's self-promotion as a big EA person (getting no more than the usual official recognition of big donors from EA institutions) and are now acting like EA should have refused the funds or somehow stopped the journalists from making the association.
Yeah, it definitely seems like SBF and the FTX implosion are just a hook to regurgitate the same arguments that people were making about EA a month ago (and the month before that, and ...). Like "this rich guy who was a prominent EA donor and advocate turned out to just be running a giant Ponzi scheme which just goes to show why my argument that we can't know anything about the world so trying to figure out what is most "effective" will lead us to disaster is completely correct for some reason".
Yeah, guys, but you still have to address the argument - and it may be an unfair argument, but this is the rod EA cut for its own back - that "EA was all about doing good better. That is, they were going around telling everyone that all the other charitable stuff was rubbish, everyone else was doing it wrong, you should put the welfare of strangers over those nearer to you, and it was all backed up by statistics and the only correct philosophy. Well, *now* look what your star pupil has gone and done. But *I'm* the idiot for putting ten bucks in the collection plate in church instead of handing it over to you lot?"
EA says 'we've figured out a better way to do good'. EA then gets mixed up with a guy who was conning everyone out of their socks. EA wasn't so smart, was it? And if it says it can figure out things better than ordinary people because it doesn't let emotional attachment or sentimental associations affect what the cold equations of efficacy tell it, then it should be held to a higher standard. But it turns out the EA movement was just as easily suckered as the ordinary guy in the street, so yah boo sucks to you!
> That is, they were going around telling everyone that all the other charitable stuff was rubbish, everyone else was doing it wrong
I mean, no it wasn't. Maybe you have an example of someone in the EA world saying this but I have never seen anything that can be interpreted in that way unless you take the position that literally suggesting anything is equivalent to saying anything you didn't suggest is garbage.
> Well, *now* look what your star pupil has gone and done
Rich guy who very publicly gave a lot of money to charity and used it for PR turned out to be a scumbag. Not exactly some novel failure of EA in particular. I mean, charities accept donations from people. Sometimes those are bad people. Sometimes those bad people use their charitable giving to launder their reputation. It's bad and I think the EA community should absolutely learn some lessons from this fiasco, but sort of mundane lessons about the relationship between any institution and its stakeholders.
> EA movement was just as easily suckered as the ordinary guy in the street, so yah boo sucks to you!
Were they suckered? Did they claim any particular expertise in evaluating the solvency of cryptocurrency exchanges? Almost every one of the world's most sophisticated investors got duped by SBF and FTX so I don't see it as particularly discrediting or surprising that a bunch of guys who run charities failed to crack the case.
More to the point, does any of this change our evaluation of how effectively they were actually using SBF's money? Not sure I see how.
EA promotes the idea of being very thoughtful about the causes and organizations you support.
It doesn't support the idea of being equally thoughtful about your donors, though perhaps it should, at least for the large donors.
It seems to me that a *lot* of people got taken in by SBF, many of whom had more capacity and responsibility to check up on him.
The unfairness of the coverage is more to do with the assumptions about the influence EA and/or utilitarianism had on Bankman-Fried.
Utilitarianism is taking quite the bashing, even more so than EA. The same breathless, syrupy admiration about this being his motivating philosophy is very easily turned around into condemnation.
Utilitarianism helped 12 year old Sam figure out that his parents, his friends, and all those around him who were pro-abortion were right! How wonderful!
But that same utilitarianism also helped 30 year old Sam figure out that robbing Peter to pay Paul was perfectly okay, too. How dreadful!
I do feel sorry for Will MacAskill, who is getting portrayed as Bankman-Fried's guru - and when the pupil goes astray, you have to look at what was the master teaching them? (But not *too* sorry; he's the guy who, as one comment in another thread suggested, said he didn't have kids because they would interfere with his work. That work that is so important for the world. Well, now him and his work are getting a black eye all over the place, makes the distractions a couple of kids might have caused him look a lot better, eh, Will?)
"Will MacAskill, then a philosophy graduate student, pitched Mr. Bankman-Fried on the idea of effective altruism, a way of applying some utilitarian ideas to charitable giving.
Mr. Bankman-Fried had considered different career paths, he said in the “80,000 Hours” interview, but Mr. MacAskill suggested he could do the most good by making a lot of money and giving it away, a popular idea in the community. "
He's one of the founders of the EA movement...
Presumably MacAskill just doesn't like children but doesn't want to say that in interviews. That's a funny thing to hold against him.
If he doesn't want kids because he doesn't like kids, that's a perfectly fine reason. Phrasing it as "I am Too Important, my time is Too Valuable, the world itself would suffer were I to be distracted by the demands of fatherhood" just sounds like you are an insufferable swelled-head with delusions as to your ultimate grandeur. You could be hit by a bus tomorrow, Will, and the world would stagger on without you.
Will MacAskill literally writes in his book 'What We Owe the Future' that one of the things people should consider to make the future better, is to have children.
I don't know whether he personally has plans to have kids, but the idea he thinks "I am Too Important, my time is Too Valuable, the world itself would suffer were I to be distracted by the demands of fatherhood" is a straw man in the extreme. Where does this idea come from?
"You could be hit by a bus tomorrow, Will, and the world would stagger on without you." This is an incredibly poor taste thing to write on a public forum and I read it as very close to "I wish you dead". It's the kind of comment I'd like to see moderated but I'm new here - is this kind of thing acceptable here?
What's wrong with "things other than kids fill my time in ways I find meaningful"? Things go wrong with work, things go wrong with kids too. A lot of people have really disappointing relationships with their kids.
The cause is suspect because it is a cause that appeals to sociopaths. Treating the life of some child in the Congo as of equal value to the people in your neighborhood is sociopathic behavior. Or whatever label you want to slap on it. It as a philosophy that appeals to people high on rationality and low on traditional human connections and values. And I can identify it easily because I am one of them.
That may be part of resentment but I don't think that explains it all. You also need the feeling that they are saying they are better than you, and that it feels plausible enough to make you feel a bit bad about it.
I think most ppl don't feel the congo kid has equal value but feel really bad about saying that out loud because of the values we hold up as a society. EA's are those weird ppl that are kinda creepy who force them to face this tension and that creates resentment.
Like I think cryo or belief we live in a simulation has a similar profile for who talks about it but tends to just generate a feeling of "look at those weirdos" (unless u are in a personal rel with someone who believes) and maybe crit for wasting money but not the same schredenfraud (not even going to try spelling it right) bc it doesn't create any feeling of guilt or self-doubt.
Ah yes, the classic sociopath behaviour of... making significant personal sacrifices in order to help others?
I know you said 'or whatever label...', but 'sociopath' is too loaded a word to radically redefine and then apply to people who don't fit the normal definition. Caring about others and taking altruistic actions to help them is about as far from sociopathic behaviour as you can get. (And most non-EAs do very little to help the people in their neighbourhood anyway.)
>Ah yes, the classic sociopath behavior of... making significant personal sacrifices in order to help others?
I would rather frame it as the classic sociopath behavior of so little valuing your actual human connections and relationships that you can somehow reason yourself into the position that distance (physical and/or social) doesn't matter ethically. It's the position of someone who doesn't have normal human feelings.
>most non-EAs do very little to help the people in their neighborhood anyway
Most everyone does. But not most people fired up about helping others, which is the group of people we are talking about and EAs are trying to proselytize to.
Wait, how does donating 10% of your income to global health charities trade off at all against having warm relationships with your friends and wider community? I think a randomly selected EA would most likely advise me to be doing both. Are you sure you're not arguing against a straw man here? I will admit to thinking that physical distance has no intrinsic ethical relevance, but it's strongly correlated with social distance, which is indeed ethically relevant, as you pointed out.
The media *does* report on other things like it reports on Effective Altruism. Namely, Republicans. Welcome to the outgroup.
I’m reminded of the “stochastic terrorism” line of reasoning. Some ideas are just so dangerous we can’t responsibly speak them out loud: trying to do good most efficiently, because easily influenced minds might be corrupted into billion dollar fraud.
Saw the NYT holding Chris Rufo responsible for the latest shooting; felt pretty wild.
Oh, the "stochastic terrorism" line is a great example. I saw it popping up all over the place and I was mightily puzzled because of cloudy memories of stochastic equations in chemistry, could not figure out what that had to do with terrorism.
Handily, Dictionary.com promoted their definition half a day after the Colorado Springs nightclub attack.
https://twitter.com/Dictionarycom/status/1594405217851916291
I'm sure they do the same after 'leftwing-coded' attacks. /s
What are some 'leftwing-coded' attacks you think would be good examples of this phenomenon? What word should the dictionary promote after such an attack if it wants to balance out promoting "stochastic terrorism"?
Statements by left-ish journalists, activists, and politicians promoted and fanned the flames of the riots of 2020. That seems like a paradigm example of stochastic terrorism on a very significant scale.
Another example would be the rhetoric of the Nation of Islam—an organization Democratic establishment politicians still suck up to. Farrakhan’s bizarre vitriolic anti-Semitism is echoed in events like the Jersey City deli shooting some years ago, along with frequent lower-grade acts of anti-Semitic violence by Black Hebrew Israelites and sympathizers. (Maybe this tendency will realign rightward, though—the bizarre ideology now includes Kanye among its adherents.)
For what it’s worth, I don’t use the term “stochastic terrorism”, I think it’s loaded language promoting the idea that “speech is violence”, a notion I despise. But if I were to adopt the term—it’s obviously not limited to the right.
Is it really that puzzling? Specific examples of whether or not some violent act was because of stochastic terrorism can be debated, but it's very obvious when Tucker Carlson goes on TV and says hospitals are mutilating children and performing illegal surgeries (which they aren't doing), and then hospitals see floods of threats of violence coming in that the two are related. You could either argue that threats of violence are entirely unrelated to actual violence and so the actual terrorists are entirely unrelated, or you could argue that Carlson didn't actually intend for people to threaten hospitals. I think you'd have a hard time arguing either of those.
I usually put the argument as "Some ideas are both true and dangerous. That some ideas are both true and dangerous is one such idea."
Ok, lets just start like this:
Do you know what the steelman description of 'stochastic terrorism' is, and do you doubt that it actually exists and is important?
Also, let me ask you this: why did lynchings of black men used to happen fairly regularly, for a set period of time in a set part of the world, and not before or after or in other parts of the world?
1. I think so
2. I don’t doubt that the phenomenon occurs: people speak in opposition to other people in bombastic, violence-laden rhetoric, and other people do violence based on that. The level of importance is unclear to me when weighed against stuff like being able to criticize cultural practices you find immoral. Any proposed solutions I’ve seen to stochastic terrorism seem worse than the illness.
I don’t think Chris Rufo is a good-faith actor, but holding him responsible for the latest shooting seems implausible. On a personal level I feel pretty weird about drag personas interacting with young children. Based on my own non-zero experience hanging out in gay bars and around drag queens, I see nothing redemptive in this cultural project. If Chris Rufo wants to criticize it in bad faith, eh. Live and let live, as my fellow liberals used to say.
I really don’t follow where you’re leading here, but I’ll play: organized and ad hoc white supremacy? The KKK plus white land owners wanting cheap labor and white laborers feeling threatened by slaves and newly freed blacks harming their bargaining power creating incentives for animus along racial lines?
"why did lynchings of black men used to happen fairly regularly, for a set period of time in a set part of the world, and not before or after or in other parts of the world?"
How do you know lynchings or other extra-judicial killing of black men (and white, yellow, red and brown men) did not happen "before or after or in other parts of the world"?
The very derivation of the term "lynching" and "lynch law" comes, ultimately, from an Irish source:
https://en.wikipedia.org/wiki/James_Lynch_fitz_Stephen
The story is that a mayor of Galway, surnamed Lynch, hanged his own son for the murder of a Spaniard because the young man was so popular with the people that no-one else would carry out the sentence. This got brought along to the USA where different lawmen named Lynch were supposed to have carried out hangings, and eventually the term "lynch law" for vigilante justice was adopted as the common term in use.
Are you going to maintain that:
(1) Only people in the USA were ever lynched (or, as I said, otherwise killed by mobs)? That's not so.
(2) Only black men in the USA were ever lynched? Again, that is not so.
But if you mean "mob murder used as a method of terrorising the black population of the American South, amongst other methods of maintaining a racial disparity and subjugation", then yeah, that happened. I don't know the history of black people in the North and if any of them were ever lynched or otherwise unlawfully killed.
So if you are defining "stochastic terrorism" to mean practices like that, I might go along. But "stochastic terrorism" has been used to refer to parents speaking up at school board meetings, as is their right. If the person using the term wants every classroom to be draped in rainbow flags, then they can make that case, but they don't get to call people who oppose that 'domestic terrorists' or accuse them of 'stochastic terrorism'.
>But "stochastic terrorism" has been used to refer to parents speaking up at school board meetings
By who? Where?
Please share a link, and if your description is accurate, I will join you in laughing at the idiot who misused the term in such a way.
But the existence of idiots doesn't undermine the reality that the term refers to, which I think we agree on (yes I'm referring to the specific US thing).
It’s funny because it’s true. I wonder if this is just an outgroup thing, as the other commenter said, or if there’s something else at work here.
So an interesting question is why EA draws this kind of coverage, and I have the start of a theory.
I feel that by raising the issue of how effective a donation is, EA makes a certain kind of person feel guilty. Before, they'd felt good about themselves for any charitable donation, but now, once cogent arguments are made that some donations are better than others, they feel guilty for not putting in the time to figure out the best donation. You can try to tell them that's not the point and no one is suggesting they feel bad about charity, but that doesn't stop the feeling of guilt for many people.
And people resent having something they felt good about suddenly make them feel guilty. A common reaction to feeling guilty because of criticism is to lash out at whoever made you feel that way and attack their virtue -- even, or perhaps especially, when the guilt is really coming from your own self-criticism.
I wish I had a good fix for this, since it would be a huge boon for EA to avoid inducing this negative affect. To a degree you can minimize it by emphasizing that you can just let GiveWell figure out what's best, but that still leaves some people feeling bad about their emotional desire to donate to help starving dogs or whatever. Even if you insist they shouldn't, the argument that it does more good to donate this other way is going to make a certain sort of person feel bad if they don't donate that way.
And (though this is much more speculative) for some people, I think there is also an element of who the criticism is coming from, e.g., there is a feeling that these nerds with calculators are invading the domain of journalists and less quantitative types and setting themselves up as arbiters of social value.
Hm, I doubt this, for the simple reason that most people have no clear idea what EA actually is, and roughly zero emotions about it. I doubt very much that they feel guilty because of EA ideas. If they have heard of EA at all before the FTX incident (and not because they investigated the background of FTX), then it is just one of the many, many weird groups of people doing weird stuff.
A much simpler hypothesis is: this happens every time something goes wrong. Certainly in cases like this, when someone did something evil. But even when something goes wrong through no one's fault. I have seen it a zillion times. But usually we don't care, because we are not personally attached to the group that is treated unkindly.
By the way, I don't think these unfair treatments are a special feature of the press. It's a general human trait to assume that if something went wrong, then people somehow related to that must have done something wrong. It's just that reporters write it down.
I think it's a lot simpler. The media are plugging the EA angle because Bankman-Fried made a big deal out of his philanthropy. If he said he had been influenced by, I dunno, movies about Santa Claus as a kid they'd be all over that one.
This is someone who claimed a virtuous motive but was (all the time? towards the end?) acting viciously. It's the ever-popular hypocrisy angle. Every megachurch preacher who ever ended up in bed with a congregant's wife, or with his fingers in the donation box, or in a motel with a rent-boy gets the same treatment: man of public virtue lives secret life of sin, was it all a con job from the start?
The whole angle of "utilitarianism" and "effective altruism" is strange and unfamiliar enough to the majority of people that it's a novel angle to explore. How could a guy who was being mentored by/working with Big Names in an ethical movement turn out like this? How could the Big Names not realise what was going on? It's the religion angle but with a secular flavour this time round.
Sure that's part of it but see articles like this one: https://newrepublic.com/article/168885/bankman-fried-effective-altruism-bunk
It's not just that they are all over the fact that a rich guy who turned out to be sleazy made a big deal of his EA giving (that's fair) but there seems to be an active desire to suggest that EA is somehow rotten because it took money from ppl who wouldn't have done good things with it and spent it doing good things.
I agree there is something to what you say. But I still think EA is treated differently than other kinds of charities, where slimy rich ppl using them to launder their reputation is basically seen as the cost of doing business: bc ppl see EA (wrongly imo) as claiming a kind of higher moral status, they then see this as a kind of karma (much like how religious orgs get held to higher standards than other orgs).
It's a perfectly reasonable response if you understand EA as looking down on ppl for their less virtuous giving but that's a mistake. But an understandable one.
Absent a movement of santas making people uncomfortable due to their insistence on effective gift giving (“give gifts to the nice, and fulfill the basic energy needs of the naughty”), I'm not sure the media would go at the santa movie angle. “Those people who made you feel bad about yourself? We're taking them down a peg!” is a necessary ingredient.
But not enough people know about EA to be made to feel bad about themselves. Most people's reaction to this news is going to be "There's a what? Called what? That does what?" and thinking it's some weird California techie thing. Not many people knew about the earnestness (at the start) about giving at least 10%, and many religious people in the US tithe anyway and wouldn't consider this a reason to feel bad about themselves, nor about the "help poor children in Africa with mosquito nets" thing.
*If* you were aware of EA and *if* you thought you should be donating to good causes and *if* you weren't donating, that *might* evoke the "these people are making me feel bad, I don't like them and want them taken down a peg" reaction. But that's not a lot of people.
I do think the simpler explanation is that this is the old scandal of hypocrisy: guy who claimed to be doing good was in fact a liar and thief.
I certainly agree that the negative coverage of Sam FB (or whatever his name is) is completely explained by what you say. It's why the whole Lance Armstrong thing was such a feeding frenzy. But that's not what needs explanation.
What didn't happen with Lance Armstrong is anyone writing stories about how the cancer charities he supported or starred in ads for were somehow bad as a result. Every time some guy who is a big proponent of MeToo is shown to be a creep, we don't get stories implying that there is something wrong with MeToo.
But (such as this link I posted in other thread: https://newrepublic.com/article/168885/bankman-fried-effective-altruism-bunk ) there have been a number of articles that seem to directly target EA as bad. This difference is what needs explanation.
The relevant ppl here are the journalists writing about it. The phenomenon that is being explained is the attitude taken in the coverage of EA and every one of those journalists necessarily is aware of EA or they couldn't be writing about it.
I agree with you (also, seems that other people on this thread almost entirely agree with you too, though they phrase it differently). I think people feel bad when some person X suggests that he is "holier than thou" - it happens with religious people too, where they cause a similar aversion in the non-religious (or different-religious) and people revel in stories where they violate their own norms. It's not enough that person X be discredited, because their ideology is what makes you feel guilty.
Yeah, hypocrisy is very easy to condemn because it doesn't require any shared ethics whatsoever.
But I do think there's also the aspect of pulling down someone who's got too big for their britches, as it were. When so-and-so is putting on a posh accent, putting on airs, acting white, pretending that they're above it all, I think there's some sort of basic monkey instinct to drag them back down into the muck with the rest of us. To prove that they're no better than anyone else, that it was all just lies and flim-flam, that anyone who gets ahead is doing it by dirty dealing. That we're justified and righteous in our own mediocrity, and that our lack of similar success is not our fault, but in fact due to our virtues. It's all well and good to get ahead, but woe upon those who do it while proclaiming to be better than other people.
I think you’re on to something here.
“Making a structure with data that does feelings better than regular feelings” is another angle. As if the nerds-with-calculators - who are assumed to lack emotions - used numbers and data structures to make a model of caring, and then used their CaringData to beat everyone else at their own game. Presumably then regular people have nothing left; the nerds are smarter, and CaringData gives them better emotions.
The amount of the fraud is spectacular. But the feeling is still weirdly innocent. Like a crime boss pretending to be a perfect person while doing dirt. They did a lot of morally ambivalent or much worse things - while waving the Emotional Perfection flag. Usually people who do dirt eventually embrace the ambiguity, enjoy the suits and cigars, know the lies are lies. I feel like the conference rooms where the fraud was planned probably had organic fair trade coffee. It’s crime without a crimey feel. It makes the facade of innocence potentially more ridiculous.
Yeah, I think there is a lot of truth in that.
Most people, when faced with evidence that this charity works better than that charity, would change donations to the better charity. Your second point might have some validity though.
lots of ppl also like to donate to charities which tug on their heartstrings. If ppl were disposed to behave like you suggest they would already be EAs
Also, ppl can get the msg that they should be picking the best one and feel they should do some research, but feel guilty when they never get around to it. You see this a lot with voting.
I'm surprised to hear you say that and want to explore more about where our models of the world differ. Do you think that most people giving money to their college's alumni fund or their church's soup kitchen simply have never considered that giving to Against Malaria Foundation might be better?
While I disagree with Nolan's claim, I'm not sure most ppl have really made the comparison between charities like you suggest. Sure, they may have looked at overhead, but I'm not sure most ppl think of charity as comparable in that way, in the same way most ppl don't rank the relative benefits of doing favors/nice things for their various friends/relatives. Sure, they will think at some point: John and James are rich enough that they can just hire movers, or the like, but within a large region no comparisons are made.
Indeed, I think many ppl think of charity as a kind of extension of our normal interpersonal assistance, where we usually think that ppl should be disposed to help those they have connections with more. So rather than ask which charity is more effective, they ask which cause they feel more connected with (e.g. if their sister died of cancer or their dad died of malaria, they choose that charity).
Hmm. Despite saying most people I might be talking about myself.
However, here in Ireland, after some scandals where it became clear that in some charities most money went to staff and executives (many of the latter living it up), there was a general movement away from those charities to charities that spent more on recipients than staff. I was part of that. I realise that I was merely changing to charities that were probably more efficient and probably less corrupt, but not certainly more effective.
I try to look at established charities with skin in the game as well; for instance, I like Médecins Sans Frontières. In that case staff costs are high, but it's mostly doctors.
EA, I heard about here first. However, the anti-malarial net idea intrigued me, so if there was a way to donate to similar charities I would be interested.
> "A common reaction to feeling guilty because of criticism is to lash out at whoever made you feel that way and attack their virtue -- even, or perhaps especially, when the guilt is really coming from your own self-criticism."
Or you could call it "EA Fragility" and write a book!
Ha ha, that's pretty funny.
I think you need to recognize that Ineffective Altruists (donors of church picnics and dog shelters) are the outgroup of Effective Altruists. Non-Altruists are just some far-group. You can recognize this by noticing who annoys you more (if you're sympathetic to EA) - the person who donates to the dog shelter (or donates their hair to cancer patients), or the person who doesn't care much for charity. Who would you feel the urge to argue with and defend against?
EA needs to find a way not to alienate these people, and it's very hard, maybe impossible, not to alienate your outgroup without losing your identity.
Many good points given the superficial and unfair criticisms of EA/longtermism.
That being said, while EA does a lot of good, it suffers from many genuine structural problems media do not mention. EAs are occasionally aware of these, but mostly consider them as negligible/intractable. A friend of mine mentions naive, antisocial utilitarianism attracting Machiavellian types (not that properly contained utilitarianism is bad), partisanship and political bias way exceeding what's necessary to operate on the nonprofit scene, general mistreatment of volunteers, interns, and low-ranking employees, dynamics creating vicious loops worsening mental health, self-recommending tendencies, lack of transparency leading to distrust in trendsetting institutions, unproductive credentialism/elitism, and major loopholes in cost-effectiveness models.
In the end, "sort-of-EA-adjacent-people" become a better match for Great EAs than Prominent EAs.
Watching the media dogpile on EA just made me like EA more.
The reason EA is taking so much heat is that its richest and most visible proponent a) claimed to know how to make the world, in general, better (EA principles), but b) actually made the world a lot worse (by actually being a cynical crook). It is obvious why this would result in negative press for EA and why people would update their priors in a way that is unfavorable to EA. If the head of Mothers Against Drunk Driving stole several billion dollars, people would also update their priors against MADD. What is actually interesting about the media coverage is not that it turned against the crypto Bernie Madoff's movement but that it was so credulous in the first place about a guy who claimed to be donating all his money to EA charities but actually owned a jet.
I mean, that is fair enough. What annoys me though is not people arguing that EA institutions have serious problems or are corrupt so much as the arguments that the entire philosophy of EA is somehow discredited. To use your example, it's like if the head of MADD stole a bunch of money and people used that to argue that drunk driving is actually fine and we were wrong to ever worry about it.
Altruism, Effective
1. I'm not 100% sure I disagree with the political activism one.
2. Wasn't that Mark Zuckerberg one an actual article?
Great post but why subscriber only?
Because if public this would turn into a bunch of petty shit-slinging and be a huge headache.
The shit-slinging is already happening, which is why Scott wrote the post to begin with. Shouldn’t EAs and EA-sympathetic people respond to it?
Yeah we probably should. Not with posts like this, though. Response should be thoughtful and charitable. It's also ok to vent with a sympathetic crowd.
I disagree there. Responses to good faith criticism should be thoughtful and charitable. Response to bad faith criticism and mockery should be snark and trolling.
Detecting the difference between good faith and bad faith is very very hard, and the "if they're serious then they're so negligent it might as well be bad faith" doesn't, in my opinion, actually get you out of that problem. I would guess most of what's happening is "there's a current event and a thing I want to say and they have the same word near them, that's good enough for an article," not "how can I dishonestly ruin EA's day?"
I agree it’s really hard to tell. I just don’t think the solution is to always assume good faith as a result. There’s a cost to both Type 1 and Type 2 errors, and IMO EAs heavily bias towards trying to assume and respond in good faith, even when it’s not optimal.
I’d guess that the amount and quality of the shit-slinging would tend to be worse with a public post.
I admit I don't pay a lot of attention to EA beyond what I read around here, but my impression so far is that it's basically a fairly trite set of ideas espoused by people who don't understand the world very well and don't get the point of the old saying that the road to hell is paved with good intentions. But I do agree that some of the recent FTX/SBF-related articles that are critical of EA are superficial and silly.
Do you agree with them? Do you think more than 5% of people in the US agree with them? If not, I don't think they can be described as "trite", except in the same sense where Christianity is "trite" because people have been doing it for a long time.
I think people actually *don't* agree with the only thing that is unique to EA, which is the set of strange and allegedly scientific assumptions it overlays on top of standard outcome-oriented philanthropy.
See https://astralcodexten.substack.com/p/effective-altruism-as-a-tower-of
If everyone agrees with the claim "it's morally important to donate 1-10% of your income or some other equivalent amount of some other resource to whatever charity you think is most effective at decreasing suffering / increasing utility", how come so few people (I would guess <1%) do this?
I think this is the main claim of effective altruism, any additional strange assumptions are just squabbling about implementation details from within the movement.
I think if your only definition of effective altruism is that you have to donate a small-to-large portion of your income to charities you think are good, then you are defining the movement in a way that is deliberately so broad as to make EA unworthy of having its own name. It would just be charity at that point. You of all people should not be hiding in a motte. Strange and unpopular ideas like focusing on AI risk, discounting future harms very highly, and quantifying the utils of a chicken sandwich aren't squabbling over the details. They are the only things that separate it from the standard concept of charity, which is the thing everyone actually likes.
I specifically said 1-10% because Peter Singer says 1%, Toby Ord says 10%. I think less than half of people give 1% and less than a twentieth give 10%. I also said "to charities that most effectively reduce suffering/promote utility". Unless you have some kind of galaxy-brained argument, that excludes churches, colleges, local food banks, theaters/operas/symphonies, and most other things Americans donate charity to. I think only between 1-5% of people fit these criteria depending on whether you use Singer's 1% number (5%) or Ord's 10% number (1% or less).
I know everyone wants you to believe that EA is only about speculative AI projects, but about 2/3 of EA donations still go to global health, malaria, things like that. There are many people in EA who have never touched animal welfare or AI risk.
I’ll try to steelman this a little, as someone who only makes it up two and a half levels on the EA tower.
There IS definitely novelty and value in the idea of systematically focusing on outcomes and donating 10%. The problem is that, in the real world, the EA movement is associated with the entire tower, not just its lower levels.
The reason there’s no sympathy for EA is that they’ve begun to focus on things that have huge and obvious failure modes. Think about the concept of “earning to give.” There’s natural skepticism to that because it just sounds too much like a justification someone would give for their own extreme selfishness. For every person who would truly earn to give, there are a dozen who would say they’re doing that as an excuse to enrich themselves.
Another example: The issue of AI Risk. At the end of the day, investing in AI risk means funneling massive salaries to PhDs, i.e. highly-educated elites. That’s terrible optics, because suddenly EA goes from “giving to the poorest in the world” to “paying rich people’s salaries” and it doesn’t matter what the justification is, it looks really bad and stinks of motivated reasoning to outsiders.
And to be clear, I’m not saying AI Risk isn’t a real thing we should worry about. I’m saying that no one should be mentioning it in the same breath as global health. That’s a fine line to walk, I know it’s complicated when so many members care about both those things, but I still believe it was a mistake to add more levels to that tower.
Before 2016, ask the average college student whether "all lives matter." Most everyone would have said yes to that general sentiment. But now, those words have become particularized - loaded with specific, additional, controversial content - and it's harder to get a "yes".
Similarly, ask anyone whether they agree with: "giving some of what you have to those in need is good." Sure, most do! But that doesn't mean they have to agree with your "giving 1-10% to charities that most effectively reduce suffering", given the specific controversial consequentialist loading you've given to "effectiveness". Even if it seems like a mere rephrasing, the introduction of loaded language will still lead people to reject the latter sentence. There's no inconsistency there - most people just aren't consequentialists.
You've become unable to understand people who think sufficiently differently from you. I think once you would have tried to understand and fix them; now you rant at them, impatient with the fact that they haven't just started giving their 10% already. And I have become impatient with your impatience; I think I'll unsubscribe. It's for the best - the bad outweighs the good in subscriber-only posts. I'll miss the poems, but I'm not going to pay money to get lazy ragebait in my inbox, there are more than enough news sites offering that service for free.
Good intentions don't always lead to hell. How can we convert good intentions into the least hellish outcome? Perhaps some movement that attempts to think hard about all the possibilities, constantly seeking constructive criticism.
This is kind of how I think of EA. I had decades of frustration around giving. I was low income but knew that there were places in the world where even the meager amount I was able to give could have a positive impact. And I wanted that. Discovering GiveWell was the answer for me. I was sick of looking into charities that attempted to provide clean water or education, only to find that they were ignoring local customs and infrastructure limitations, and creating more mess than happiness. EA did indeed seem to convert good intentions to good outcomes instead of hellish ones.
>Young people should resist the lure of political activism and stick to time-honored ways of making a difference, like staying in touch with their family and holding church picnics.
Based and clear pilled.
(I'd guess the media is unfair to EA sometimes mostly because it's still fairly novel and doesn't have a "this is just normal and how things are" vibe yet. But as others have pointed out, the media is often unfair to anyone it considers a threat, and EA is implicitly a pretty big power grab)
Clear pilled?
I believe the reference is to https://americanmind.org/salvo/the-clear-pill-part-1-of-5-the-four-stroke-regime/
(disclaimer that I do not agree with or support Yarvin, I just happen to know the reference)
It’s good that you’re trying to be helpful, but some ideas are just too dangerous to share, like effective altruism or Curtis Yarvin.
(This is a joke, but made uncomfortable for me because I think Curtis Yarvin borders on “stuff you shouldn’t put into your brain because it makes you a worse person.” He’s like a Sith version of an ayahuasca shaman.)
Actually, it might be a good idea to expand a bit on Incanto's interesting, perceptive comment.
In Scott's post, "resist the lure of political activism" occurs in a parodic context. It's not meant seriously, but as a reductio ad absurdum: talking this way is so obviously ridiculous that all Scott needs to do is to point out the similarity with how the media talks about EA, and the point is made.
In contrast, Yarvin, with his "clear pill", really does want to convince people to resist the lure of political activism. I think it's good to highlight that disagreement, since Scott has been accused of secretly supporting Yarvin.
I think Scott has been pretty vocal about thinking political activism is a high effort, low reward activity. Indeed, this is a premise of much of EA: otherwise they’d be all “help reduce suffering, vote Democrat/NRX/Ron Paul!” The rationalist sphere has been instrumental in pushing me out of political activism, if only by providing me a more humane framework to replace the totalizing lefty worldview.
While this might be big-picture true about the way EA is covered, the notable part about its coverage this time around (FTX situation) is that it’s been insanely positive/hands off. There have been opinion pieces and such from EA haters, but I found the establishment news to be remarkably sympathetic toward SBF and his goals to change the world for the better.
Separately, a few of these fake headlines make good points!
I think EA's are to some extent selectively seeing negative coverage (like performers who lose sleep over that one negative review). It seems to me that most popular news sites don't go into details about EA when covering FTX, or don't cover it too negatively if they do. It's true that for a certain subset of nerdy and judgy people who pay attention to EA and decided that they hate it, SBF is a godsend and a confirmation of everything they believed is wrong about it. But in my experience these are people who already made up their minds and even they will soon move along to expressing their condemnation of the next trending thing they hate. That said, I enjoyed the examples here!
This one was particularly satisfying. All of the gnashing of teeth about effective altruism is so poorly reasoned.
It seems like EA is taking a bigger beating in the media than SBF. I'm _honestly_ not sure why.
Well, for one, SBF is from the right sort of people. Rich elites, lefty politics, etc.
But most EAs are also rich elites with lefty politics, no?
His friends and parents are a little more plugged in than your random EA person. Not to mention the direct coverage political donations get you.
SBF also gave a lot of money to the mainstream media.
Brilliant.
I mean c'mon, the media isn't so naive as not to recognize that every bad person ever tries to associate themselves with charitable giving. Plenty of major international charities happily take money from the sleeziest ppl around without generating much negative comment.
But I think a better way to phrase my claim is this: because ppl see EA (wrongly imo) as claiming a kind of higher moral status, they then see this as a kind of karma (much like how religious orgs get held to higher standards than other orgs).
The orthogonality thesis in action: the media as highly capable agents, bringing considerable wherewithal to dystopian agendas built on horrifyingly confused conclusions.
(1) "Young people should resist the lure of political activism and stick to time-honored ways of making a difference, like staying in touch with their family and holding church picnics."
Isn't this the same approach as "your single vote doesn't count, stay at home and don't vote" argument which I've seen in the comments on this here site before?
(2) "Obviously this can only be because he’s using his photogenic happy family to “whitewash” his reputation and distract from Facebook’s complicity in spreading misinformation."
I've *seen* the Metaverse trailers, where Zuckerberg is taking a call from his wife (I think she's his wife?) sending him a video about cute stuff their dog is doing. They still fail to convince me that this is anything other than set up as a marketing ploy:
https://www.youtube.com/watch?v=b9vWShsmE20
Does this look like something a "photogenic happy family"-man would willingly use as his global image selling-point, or what an android thinks a human would pick?
https://www.indiewire.com/2022/08/mark-zuckerberg-metaverse-hollywood-1234753898/
(3) "If these tech bros would just read a book, they would learn that excessive concern about wife-kidnapping is dangerous and unnatural."
Well, pretty much everyone *did* end up dead.... and by one version of the legend, the real Helen was in Egypt all the time anyway.
> Well, pretty much everyone *did* end up dead....
Well, that’s unusual.
Odysseus, being smart, tried his best to avoid being drafted for this hopeless war. Achilles' mother dressed him up as a girl and sent him to live with the princess of another court among her maidens to keep him away. Many of them at the time knew getting pulled in was a bad idea, but due to the web of alliances and obligations, they had no choice about turning up to support Menelaus and Agamemnon.
Rather like the First World War, where the assassination of an Austrian archduke spiralled into global conflict because everyone was caught in a spider's web of connections.
I'm sympathetic to the concept of EA, and give some of my annual donations to GiveDirectly, who I believe is somewhat EA-aligned. I'm not familiar with the community, so maybe this question is ignorant.
Why was SBF embraced at all by the EA community? It seems most of his giving was political, and not particularly effective. He gave $20 million to a single congressional candidate in Oregon who got smoked in the primary and a couple hundred million to a Biden IE in 2020, where the marginal impact of a dollar is pretty damn low. Isn't this kind of giving the kind that will personally give the donor a lot of prestige, but is low on the measurable impacts EA embraces? (FWIW I do give money politically, but I see that as separate from pure charitable giving). Were there people in the EA community before SBF crashed saying, "this guy is not doing EA?"
The $20 million Oregon guy was Carrick Flynn, who formerly worked at the Future of Humanity Institute and was an EA really concerned about x-risk. SBF thought it would be good to have someone like that in Congress helping propose/pass x-risk related laws. I think everyone now agrees this was a bad idea, but I don't think it was insane at the time.
SBF also promised a few billion dollars to the Future Fund, a group led by EA philosophers that was going to try to donate it in the most EA way they could think of. I think they ended up distributing $200 million before FTX collapsed, which was about 50% of EA's funding during that time period.
In terms of him being "embraced by EA" - he used to be a staffer at the Center for Effective Altruism before he left to start his crypto company. So I think he started out part of the community and then became rich and we weren't going to expel him from the community just for being rich. Also, we were briefly getting 50% of our funding from him, and although I don't know if that counts as "embracing" him, I think it would have been dishonest to deny that this made him important and relevant.
I don't think it was an insane idea to support a political candidate, but I think politics, an area where people spend tons and tons of energy analyzing and debating how things should be done, is the exact wrong place to try to bring in EA principles. Coming in saying, "I used math and figured out how to effectively do politics" seems very hubristic and probably wrong.
I personally think EA has a lot more to give in the realm of charitable giving, where there is very little thought given by most donors and very little critical press coverage. It seems in that realm there are huge gains to be made by asking simple questions about effectiveness, and I'm not sure why EA (speaking as a very uninformed outsider) has drifted from providing basic cost-benefit analysis on charitable giving to nebulous realms like electoral politics.
I don't think political donations are really considered an EA cause. SBF was donating a lot personally to politicians but aside from being an EA donor he was running a large crypto business that was heavily lobbying to enact favorable crypto regulations so he could operate in the US. I suspect that had quite a bit to do with his political donations.
It has drifted because the former is boring and not very sexy and doesn’t get you invited to fancy parties or treated like a big shot.
Everyone loves the dude using his money to pick out the next mayor, no one gives a fuck about the city compliance officer.
As context here, Martin Blank has no particular insight into either EA spending (of which the supermajority still goes to direct global poverty or animal welfare) or the arguments and the manner in which political involvement was discussed (which has been for one particular political candidate and, when not, specifically on the issue of pandemic preparedness). I guess they'd pick a mayor if they thought NIMBYism was an important local cause, but this action is definitely not in his world model.
Do not misinterpret his confidence and volume of posting for actual information about EAs.
(FYI Mike, as an EA I am very much with you on the less effective nature of politics, and most EAs would talk about how systemic political change is a false prophet. However they were seduced by https://www.overcomingbias.com/2007/05/policy_tugowar.html and thought that pandemic preparedness was different enough that the wastefulness of polarization doesn't apply, and that pandemic preparedness was sufficiently neglected and worthwhile that even long shots were good.)
You are taking the example way too literally. The point is X-risk and AI risk and political campaigns are sexy. Being a clearinghouse for research on charity effectiveness is not.
And I don’t need to be an expert in what EA are doing because they talk about it constantly, so you can just listen to what they actually say! Wild eh?
Yes, and how many EAs have you witnessed talking about global poverty and then switching to AI risk? I would estimate close to zero, because in fact the global poverty subsection of EA is distinct from the X-risk subsection of EA! You're welcome to link to examples or name names, but I suspect you're mistaking the order in which you're hearing things for the direction EA thought has evolved.
The most respected organizations in EA are GiveWell and 80,000 Hours, the former being a clearinghouse for charity research and the latter being career advice, with incidental advice on AI risk if that turns out to be the cause priority they are interested in. I believe this because most EAs give to AMF and other GiveWell-endorsed charities as the default. MIRI and other X-risk organizations do not have nearly the mindshare that GiveWell does, and this has remained true. The fact that an Oxford philosopher has published a book on longtermism and had a large media campaign about it does not mean that Global Poverty EAs have vanished in a poof of logic.
The closest thing I can think of to support this PoV is https://forum.effectivealtruism.org/posts/83tEL2sHDTiWR6nwo/ea-survey-2020-cause-prioritization, but the survey finding higher EA engagement correlating to more AI risk engagement is entirely consistent with global poverty causes becoming more popular and the original core of EA containing a sub population concerned with high AI risk. To mention this without caveats or linking to evidence is bad form.
If you aren't an expert in EAs and are just listening to what they say, I believe it's proper to assume that you've properly vetted which of your statements are supported and which ones aren't, and to distinguish between them. Since you haven't, and haven't provided any basis for your blustering, bloviating and bulverism about "what EAs really think", especially when the facts on the ground about popular causes and the amount of money moved contradict your narrative, you instead focus on one throwaway sentence not central to my point.
Forget altruism, I find this type of behavior disgusting because it's compromising plain old humanist values of honesty, engagement and thoughtfulness. When confronted with evidence that what you believe isn't the whole picture, you equivocate and claim that what they're doing doesn't end up mattering at all, in a thread about why EA has changed what they are doing.
As always, if you have evidence or have good reasons that you have not shown, I am wrong in my dismissal and my evaluation of your character and I will withdraw my previous statements if you are willing to provide reason or evidence.
Some EA causes such as pandemic prevention have a natural overlap with politics. An NGO can lobby, raise awareness or do research, but the actual work has to be done by the state as it is the one with the required medical, bureaucratic and regulatory infrastructure.
Also "doing politics more effectively than present-day politicians are doing it" doesn't seem like a hard bar to clear. And once you are doing cost-benefit analyses of altruistic causes, it's natural to start to worry about whether you are over-focusing on the kind of causes which are easy to analyze, which doesn't necessarily mean their cost-benefit ratio is better; and there is a lot of money in politics. The total yearly effective altruism budget is somewhere on the scale of $1B. The Build Back Better bill in its initial version would have spent about $3.5 trillion on a wide array of causes. A few politicians committed to effective altruist causes might be able to move quite significant amounts of money as part of negotiations in an evenly split Senate. Even if you just look at cost-neutral common-sense improvements to some things that aren't really commonsensical currently, there are many opportunities.
If you want to get the government to spend money on pandemic preparedness, that's a niche enough issue you can probably get bipartisan support if you lobby correctly. There are hundreds of lobbies you can learn from, and get it pushed through the "secret Congress" Yglesias writes about. Spending $20 million on a single congressional race, and having it backfire when people question the source of the money, is not effective.
As for politics, there are huge debates about what is effective government. I am on the political left and would probably have huge disagreements with many people in this thread, while we'd likely all agree about malaria nets.
I donate politically and believe my donations are altruistic. But my donations are also my attempt to impose my ideas and values on the country. I personally think my ideas and values are right, but many disagree, hence political divisions. To some extent, politics is a game we play to compete for power and score wins for our tribe. For that reason I think politics is too complicated to apply EA metrics. There are entire academic disciplines - political science and economics - where people try to determine effectiveness in politics. It is not easy to determine.
Every person who ever saw a film produced by Harvey Weinstein is complicit.
omg amazing. I'm trying to control my giggles lest i have to explain
The only reason I enjoy a tiny bit of schadenfreude in all of this is that some—undoubtedly not all—seem to give off the impression that they think that before they came along a few years ago nobody else in history ever had the insight that charitable donations are best directed toward where they will do the most good. Rather, everyone else had uncritically donated money to the local opera house with no idea that there could be any issue there. If the same community had done more or less the same substantive work but—instead of making up a cute name for themselves and seeming to think they had invented a novel theory of charitable giving—had just said they were doing some work to help figure out really good recipients and to think through some of the issues related to charity, then they would have much more of my sympathy. It goes without saying that none of this justifies poor media criticism. But as others have noted there have been plenty of equally poorly reasoned articles supporting “effective altruism” and attacking other views. And it’s worth noting that the very hubris of designating themselves as some grand new movement undoubtedly contributed to the lack of media understanding.
Your use of the word "seeming" is doing an awful lot of work to make your comment not technically incorrect. Nevertheless, you are ascribing positions to people who do not hold them, which is illogical and unjust.
What would you characterize as the novel insight that warrants the invention of a special name like “effective altruism”? I thought it was more or less developing the idea that donations should be directed toward where they will do the maximum good, but I (seriously) stand ready to be corrected if there is something else there, as opposed to just doing good work helping to develop and apply principles that had long been recognized beforehand?
No novel insight is necessary to "warrant" a name like Effective Altruism. Groups can name themselves basically whatever they want without any additional justification.
They can absolutely choose to name themselves. But it does tend to convey the impression that they have some new foundational ideas that separates them from those who fall outside the group or preceded it. At least in my opinion.
You will find yourself perpetually confused if you adopt that mentality. Most groups don't have a new foundational idea, but get named all the same. Effective Altruism succinctly describes the group's goals. That alone makes it a better name than most.
P.S. Ascribing positions to people that do not hold them is still illogical and unjust even if you give the caveat that it is just your opinion.
What I was characterizing as my opinion was a view about what this conveys to reasonable people, not to me in particular, so it’s either right or wrong; either way your postscript is confused. But yes, in this case I do think that EA is ordinarily viewed as offering something conceptually new, and that the name, while not solely responsible for that impression, contributes to it.
I'll note that the following have been presented as clear and obvious reasons for why EA as a movement is psychotic:
1. Caring about someone who isn't your family means not caring about your family at all, making EA a self-defeating / anti-social philosophy.
2. Trying to do things that you don't personally do or see for yourself means it's impossible to evaluate effectiveness.
3. Existing status-quo morality and actions, like doing stuff for soup kitchens or helping the homeless, are already maximally virtuous, and trying to do anything other than that is a distraction.
4. Any sort of monetary donation is mere cover for other nefarious activities.
Note that these arguments are being made in this very comments section, not to mention actual articles about EA. Have you not noticed them? If you have, why do you think those statements are consistent with the world where everyone obviously believes doing good better is what Charity is for, as opposed to signaling care, conscientiousness or loyalty?
I think these objections, most of which are equally applicable to utilitarian systems more generally, have been around a long time before a bunch of people decided to make up the name “effective altruism.” I personally think that some of these critiques have some merit to them, but I wouldn’t overstate them. But respectfully, I don’t view them as really novel to EA.
Is your problem that EA claims are not novel, therefore they don't get a new name? Would you apply this standard to the Abolitionists? Any number of new religious denominations with comparatively minor theological disagreements? The Constitution, since most of its ideas are derived from Enlightenment-era philosophers, does not deserve to be considered separate?
Your standard, if taken at its word, implies that EA is not even special in terms of how derivative it is, and is an isolated demand for originality.
Abolitionists were a group of people who worked for the abolition of slavery. If a new group of people came up in the middle of the project and announced that they were a cool new movement called “Effective Abolitionism” then I would want to ask what novel ideas separated them from the broader movement. EA (the real EA, not the hypothetical abolitionist analogue) has received a lot of plaudits and cultivated an image of novelty and being on the cutting edge. If the defense is that they aren’t offering novel ideas after all then I would say they should have taken a less hubristic-seeming approach and done good charitable work without presenting themselves as a special vanguard.
If there is an existing network of people who use broadly EA principles that already exists (using gold-standard RCTs, trying to optimize charitable giving in some fashion, picking careers to do the most good) then I have not heard of them. What named movement are you referring to, and why haven't I heard of them despite wanting to find people who do not reflexively demand that you toe the morality line?
We keep going back and forth between “no there’s nothing novel here and there needn’t be” and “yes they are novel; name me someone else doing this!” I was responding to the former argument. As to the second, there has been a long history of altruism with efforts to judge and assess the best recipients. This isn’t a new idea. It’s fantastic for a bunch of smart people to work on these issues and to place an emphasis on generosity. But I don’t see anything conceptually novel here. I’m open to argument, though, if you want to give me one. But then if the answer turns out to be “no there isn’t any conceptual novelty here but you don’t need that,” then I revert back to the argument in my prior comment.
This is epic, pleeeeease consider making it public (maybe in a few months, though)
I don’t think the human urge to pile on when someone screws up in a spectacular way is a very new thing. Nor the urge to find someone to blame when things go south.
The old saying, no good deed goes unpunished, comes to my mind.
It is interesting to me how so many people seem to have reached the conclusion that SBF was a con man through and through when the facts on the ground don’t even come close to supporting that. That might change, but right now the only clear thing is that he misdirected a whole bunch of money to cover up a gaping hole in the balance sheet somewhere else, and (I imagine) assumed that crypto would start going up again and he would be able to make everyone whole. How much damage has been done to a lot of people by the crash in crypto currencies that has nothing to do with FTX per se? In my opinion, comparing him to Bernie Madoff at this juncture in the investigation is really over the top. Bernie Madoff ran a Ponzi scheme for years with no intention of making anyone whole. What endgame was in his mind I will never know. Nothing has come out of this yet that puts SBF anywhere near that league.
I don't think SBF is being portrayed unfairly. Yes it's a different scam than Madoff, but the amount of scam is still superlative. Maybe SBF legitimately thought he could make people whole in the end, but there was so much book-cooking involved here that there's no innocent explanation. We don't know all the details--and likely won't for years as this gets disentangled--but we know enough to know it's bad.
Historically almost everyone who pulled something off like this had some plausible theory in their mind how they would “eventually make everyone whole”. Charles Ponzi included.
I am not sure that is much of a defense.
I am not sure that the original Charles Ponzi cared much about that, but I am no expert. And I am not defending him so much as applying my own sensibilities to the information I have available. So far it is more Ken Kesey and the Merry Pranksters than it is the James gang.
The original Charles Ponzi for sure thought he was going to dig his way out. The guy was a lifetime grifter and loser, but he didn’t have some sort of business arbitrage idea he built it upon.
I suspect it was quite similar.
You could well be right. I looked up Ponzi and it didn’t settle anything for me. Did he think everyone would come out alright in the end, or did he know he was screwing them? Or does it matter?
In some corners of the media, it is being portrayed rather matter-of-factly. In other places, not so much. We will see how it unfolds, but I was just stating my opinion about what I perceive as a rush to judgment in some circles, and more than a little of, “I’m shocked to find gambling going on here, Rick.”
You are giving him a huge benefit of the doubt here. He seems to have behaved much more poorly than the best case scenario you are laying out here.
Anyone defending SBF almost certainly has the MSM rootkit malware module installed in their brain
I very much want to know what that is. Especially because it’s in my brain. It sounds rather exciting.
Just get the NYT to cover you or something you care about, in a politically-charged context where their politics incentivize them to be misleading. The feedback loop shorts out the chip, causing first pain, and then a feeling of emptiness and loss, followed by a sense of isolation as you will then be cut off from the collective. After that, all you need to do is meditate for half an hour and you'll be able to pinpoint the location precisely.
I think I have experienced something like this.
It was an article in NYT making a case that Kodak et al were making racist, biased film stock because darker faces didn’t come out well .
We will see. I am open to a change of heart based on reasonable evidence.
Maybe not a conman, but if you read the bankruptcy filing, you wouldn't run a school charity sale the way he ran things (and I know, I've had to prepare paperwork for our auditors including details of the petty cash; if we have to account for a couple of thousand over a year, what the heck were the FTX auditors doing with regards to billions? If there even *were* auditors, which there may not have been, which is another problem):
https://www.documentcloud.org/documents/23310507-ftx-bankruptcy-filing-john-j-ray-iii
"I have over 40 years of legal and restructuring experience. I have been the Chief Restructuring Officer or Chief Executive Officer in several of the largest corporate failures in history. I have supervised situations involving allegations of criminal activity and malfeasance (Enron). I have supervised situations involving novel financial structures (Enron and Residential Capital) and cross-border asset recovery and maximization (Nortel and Overseas Shipholding). Nearly every situation in which I have been involved has been characterized by defects of some sort in internal controls, regulatory compliance, human resources and systems integrity.
Never in my career have I seen such a complete failure of corporate controls and such a complete absence of trustworthy financial information as occurred here. From compromised systems integrity and faulty regulatory oversight abroad, to the concentration of control in the hands of a very small group of inexperienced, unsophisticated and potentially compromised individuals, this situation is unprecedented.
...I have been provided with an unaudited consolidated balance sheet for the WRS Silo as of September 30, 2022, which is the latest balance sheet available. The balance sheet shows $1.36 billion in total assets as of that date. However, because this balance sheet was produced while the Debtors were controlled by Mr. Bankman-Fried, I do not have confidence in it, and the information therein may not be correct as of the date stated."
That last is a running refrain all through the filing: "I do not have confidence in [statement provided] and the information therein may not be correct as of the date stated." I would recommend reading the filing; it breaks down the structure of the entire house of cards and is entertaining on top of that, not something you can ordinarily say about financial documents, due to Mr. Ray the Third being hopping mad at the mess he has been landed with to clean up.
Yep, looks like auditors never went next, nigh or near this set-up:
"Alameda Research LLC prepared consolidated financial statements on a quarterly basis. To my knowledge, none of these financial statements have been audited. The September 30, 2022 balance sheet for the Alameda Silo shows $13.46 billion in total assets as of its date. However, because this balance sheet was unaudited and produced while the Debtors were controlled by Mr. Bankman-Fried, I do not have confidence in it and the information therein may not be correct as of the date stated.
...To my knowledge, Debtors Clifton Bay Investments, LLC and FTX Ventures Ltd. prepared financial statements on a quarterly basis. The September 30, 2022 balance sheet for Debtor Clifton Bay Investments LLC shows assets with a total value of $1.52 billion as of its date, and the September 30, 2022 balance sheet for FTX Ventures Ltd. shows assets with a total value of $493 million as of its date. To my knowledge, none of these financial statements have been audited. Because these balance sheets were unaudited and produced while the Debtors were controlled by Mr. Bankman-Fried, I do not have confidence in them, and the information therein may not be correct as of the date stated.
I have not been able to locate financial statements for Island Bay Ventures Inc."
Has Mr. Ray mentioned that he does not have confidence in the information? Because he doesn't, you know 😁:
"The FTX.com platform grew quickly since its launch to become one of the largest cryptocurrency exchanges in the world. Mr. Bankman-Fried claimed that, by the end of 2021, around $15 billion of assets were on the platform, which according to him handled approximately 10% of global volume for crypto trading at the time. Mr. Bankman-Fried also claimed that FTX.com, as of July 2022, had “millions” of registered users. These figures have not been verified by my team.
The Dotcom Silo’s unaudited consolidated balance sheet as of September 30, 2022 is the latest balance sheet that was provided to me with respect to the Dotcom Silo. It shows total assets of $2.25 billion as of September 30, 2022. Because such balance sheet was produced while the Debtors were controlled by Mr. Bankman-Fried, I do not have confidence in it, and the information therein may not be correct as of the date stated."
The lawyers are going to make *fortunes* out of disentangling this. One (headquarters) for everyone in the audience!:
"As mentioned above, Alameda Research LLC is organized in the State of Delaware. The other Debtors in the Alameda Silo are organized in Delaware, Korea, Japan, the British Virgin Islands, Antigua, Hong Kong, Singapore, the Seychelles, the Cayman Islands, the Bahamas, Australia, Panama, Turkey and Nigeria.
...On November 10, 2022, the Securities Commission of the Bahamas (the “SCB”) took action to freeze assets of non-Debtor FTX Digital Markets Ltd., a service provider to FTX Trading Ltd. and the employer of certain current and former executives and staff in the Bahamas. Mr. Brian Simms, K.C. was appointed as provisional liquidator of FTX Digital Markets Ltd. on a sealed record. The provisional liquidator for this Bahamas subsidiary has filed a chapter 15 petition seeking recognition of the provisional liquidation proceeding in the Bankruptcy Court for the Southern District of New York.
In addition, in the first hours of November 11, 2022 EST, the directors of non-Debtors FTX Express Pty Ltd and FTX Australia Pty Ltd., both Australian entities, appointed Messrs. Scott Langdon, John Mouawad and Rahul Goyal of KordaMentha Restructuring as voluntary administrators."
We're a registered charity, we are bound by a load of regulations, including the necessity for regular board meetings and records of same. If only we were a fancy-schmancy crypto outfit, we'd be able to get away with murder.
Governance and cash management were terrible to non-existent, and Mr. Ray is making sure Bankman-Fried and/or others can't access the bank accounts:
"Many of the companies in the FTX Group, especially those organized in Antigua and the Bahamas, did not have appropriate corporate governance. I understand that many entities, for example, never had board meetings.
The appointment of the [new, independent] Directors will provide the FTX Group with appropriate corporate governance for the first time.
...The Debtors have been in contact with banking institutions that they believe hold or may hold Debtor cash. These banking institutions have been instructed to freeze withdrawals and alerted not to accept instructions from Mr. Bankman-Fried or other signatories. Proper signature authority and reporting systems are expected to be arranged shortly.
Effective cash management also requires liquidity forecasting, which I understand was also generally absent from the FTX Group historically. The Debtors are putting in place the systems and processes necessary for Alvarez & Marsal to produce a reliable cash forecast as well as the cash reporting required for Monthly Operating Reports under the Bankruptcy Code."
Turns out *some* of the web of companies *did* have auditors - that were based out of the Metaverse 🤣
"The FTX Group received audit opinions on consolidated financial statements for two of the Silos – the WRS Silo and the Dotcom Silo – for the period ended December 31, 2021. The audit firm for the WRS Silo, Armanino LLP, was a firm with which I am professionally familiar. The audit firm for the Dotcom Silo was Prager Metis, a firm with which I am not familiar and whose website indicates that they are the “first-ever CPA firm to officially open its Metaverse headquarters in the metaverse platform Decentraland.”
I have substantial concerns as to the information presented in these audited financial statements, especially with respect to the Dotcom Silo. As a practical matter, I do not believe it appropriate for stakeholders or the Court to rely on the audited financial statements as a reliable indication of the financial circumstances of these Silos.
The Debtors have not yet been able to locate any audited financial statements with respect to the Alameda Silo or the Ventures Silo.
The Debtors are locating and securing all available financial records but expect it will be some time before reliable historical financial statements can be prepared for the FTX Group with which I am comfortable as Chief Executive Officer. The Debtors do not have an accounting department and outsource this function."
"I have substantial concerns" - you don't say, John J.?
Did you work for FTX at any point in time? You never know, you might have done! They don't know themselves who exactly did or didn't work for them!
"The FTX Group’s approach to human resources combined employees of various entities and outside contractors, with unclear records and lines of responsibility. At this time, the Debtors have been unable to prepare a complete list of who worked for the FTX Group as of the Petition Date, or the terms of their employment. Repeated attempts to locate certain presumed employees to confirm their status have been unsuccessful to date."
Though credit where it's due, Mr. Ray does appreciate those who were doing their jobs as best they could:
"Nevertheless, there is a core team of dedicated employees at the FTX Group who have stayed focused on their jobs during this crisis and with whom I have established appropriate lines of authority and working relationships. The Debtors continue to review personnel issues but I expect, based on my experience and the nature of the Debtors’ business, that a large number of employees of the Debtors will need to continue to work for the Debtors for the foreseeable future in order to establish accountability, preserve value and maximize stakeholder recoveries after the departure of Mr. Bankman-Fried. As Chief Executive Officer, I am thankful for the extraordinary efforts of this group of employees, who despite difficult personal circumstances, have risen to the occasion and demonstrated their critical importance to the Debtors."
Getting back to "you did *what*???" territory:
"The Debtors did not have the type of disbursement controls that I believe are appropriate for a business enterprise. For example, employees of the FTX Group submitted payment requests through an on-line ‘chat’ platform where a disparate group of supervisors approved disbursements by responding with personalized emojis.
In the Bahamas, I understand that corporate funds of the FTX Group were used to purchase homes and other personal items for employees and advisors. I understand that there does not appear to be documentation for certain of these transactions as loans, and that certain real estate was recorded in the personal name of these employees and advisors on the records of the Bahamas."
The really serious allegations when it comes to who was handling the money and where it went (or didn't go):
"The FTX Group did not keep appropriate books and records, or security controls, with respect to its digital assets. Mr. Bankman-Fried and Mr. Wang controlled access to digital assets of the main businesses in the FTX Group (with the exception of LedgerX, regulated by the CFTC, and certain other regulated and/or licensed subsidiaries). Unacceptable management practices included the use of an unsecured group email account as the root user to access confidential private keys and critically sensitive data for the FTX Group companies around the world, the absence of daily reconciliation of positions on the blockchain, the use of software to conceal the misuse of customer funds, the secret exemption of Alameda from certain aspects of FTX.com’s auto-liquidation protocol, and the absence of independent governance as between Alameda (owned 90% by Mr. Bankman-Fried and 10% by Mr. Wang) and the Dotcom Silo (in which third parties had invested)."
Go read the whole thing, it'll explain why this is such a tangled mess.
This is interesting. Thank you.
It’s a bloody mess for sure.
Even if Bankman-Fried wasn't a fraudster, or didn't start out as a fraudster, the way he (and it really is majorly down to him) ran the concern was terribly careless, so it's not surprising they lost money because nobody really seems to have known what the hell was going on or who was doing what.
The Sequoia article is also a great look into what was going on. The writer was won over by Bankman-Fried's charisma (or whatever field of conviction he had going on), so he ends up giddy with praise, but the warning signs are there, in hindsight:
https://web.archive.org/web/20221027181005/https://www.sequoiacap.com/article/sam-bankman-fried-spotlight/
"The HQ building is distinguished by a reception desk in the microscopic lobby. The door is unlocked. There is no receptionist. I peek around the corner and into the FTX command center—29 desks in a room designed to hold 8, at most. Every desk touches two or three others. There are no aisles. To get across the room, you have to wade through (and, at times, climb over) a sea of office chairs. Walls of wide-screen monitors—two, four, even six per desk—stand in place of cubicle walls. The screens erupt like palm leaves from aluminum uprights and are oriented willy-nilly: up, down, sideways. Some screens are mounted so high they seem to hang down from the ceiling. It’s office environment as jungle, and the oddest thing about it is that no one seems to be home.
...At first blush, the scene is classic startup—the kitchen full of snacks and soda; the free catered breakfasts, lunches, and dinners; the company bathroom stocked with everything you’d need to actually live at the office: Q-tips, disposable razors, Kotex. In keeping with the fashion aesthetic of senior management, the dress code is marketing-swag-meets-utilitarian-merch: gift-bag T-shirts featuring the FTX logo, nylon athletic shorts, white-cotton gym socks.
But, as the week wears on, it’s the differences that start to stand out. FTX is not your ordinary startup. Most noticeable is the average age of the employees. Among senior management, SBF himself has just turned 30; Singh is 28; Arora is the old man of the group at 35. The company is also remarkably international. You hear the rapid-fire rhythm of Mandarin as often as English, but even that lingua franca comes in a wide variety of flavors—everything from a Bahamian lilt to the broken argot of ESL."
And here's a guy who is probably regretting his life-choices right now:
"Can Sun, FTX’s in-house legal counsel, tells me that his main job is to cement the many deals SBF makes on a handshake. Ninety-nine times out of a hundred, Sun says, the terms favor the other side. It’s another corporate policy derived from a rigorous logical argument: In an iterated prisoner’s dilemma, the best first move is always to cooperate. And, if the counterparty defects, “it’s better that I know this guy will screw me over now,” Sun says, “rather than later.”
I think the part in the following about "collegiate feel" is important; whatever about being young people in a young company, they are all around thirty years of age, late twenties to early thirties, but they're still living/working in a setup like they're nineteen and in college. That's not how you handle a business dealing with hundreds of millions and with ambitions to do even more. It's a fundamental lack of responsibility; yes, they may be enjoying what they are doing, making their passion their work, but there comes a time when you have to grow up and do the boring routine adult stuff, like keeping a set of accounts and not climbing over chairs in the main office to get to your desk:
"I clock in at FTX HQ at nine and clock out at five for most of the week—until, one day, I’m invited to live in what amounts to the FTX dorms. Many employees take advantage of subsidized corporate housing at a nearby development called Albany. The heart of the development is a yacht basin and marina surrounded by half a dozen residential towers. The area is so new that several towers are still under construction. FTX owns a passel of the multi-bedroom apartments in the towers and rents them out as crash pads to employees. There’s a collegiate feel to the whole setup. Indeed, Albany could be mistaken for an institution of higher learning: Behind the gatehouse is everything you could ever ask for on a campus—restaurants, cafés, a health club, golf and tennis facilities and, of course, classrooms."
The scale of ambition was definitely grandiose, maybe even delusional:
"To be clear, SBF is not talking about maximizing the total value of FTX—he’s talking about maximizing the total value of the universe. And his units are not dollars: In a kind of GDP for the universe, his units are the units of a utilitarian. He’s maximizing utils, units of happiness. And not just for every living soul, but also every soul—human and animal—that will ever live in the future. Maximizing the total happiness of the future—that’s SBF’s ultimate goal. FTX is just a means to that end.
...“So,” I summarize, “you are young and vital and peaking at precisely the point where the world is at, as you see it, peak crisis.” SBF nods in agreement, deep in another round of Storybook Brawl. “Does that strike you as just a lucky coincidence, or does that strike you as perhaps a signal that your thinking is flawed and you have a savior complex?”
“It’s an interesting question,” he says, stalling.
I double down: “You just happen to be alive in the most important time in the history of the human race. The existential point! Really?”
SBF hedges: “It certainly would not be one’s prior—at least, not naively.”
“Prior”—that’s a term of art. There’s more math to explain (in this case, Bayes’ theorem), but in the interest of you, dear reader, I will skip it.
“But,” SBF continues, “if you want to really needle on that, there are some anthropic considerations by which that might not be as crazy as it sounds.” With the mention of “anthropic” we’ve reached conversational escape velocity and head into the nosebleed regions of modern metaphysics. Again, I’ll spare you the trouble. Suffice it to say that, while SBF is willing to consider the idea that he might be delusional, as a kind of thought experiment, he ultimately dismisses it.
Game over.
After my interview with SBF, I was convinced: I was talking to a future trillionaire. Whatever mojo he worked on the partners at Sequoia—who fell for him after one Zoom—had worked on me, too. For me, it was simply a gut feeling. I’ve been talking to founders and doing deep dives into technology companies for decades. It’s been my entire professional life as a writer. And because of that experience, there must be a pattern-matching algorithm churning away somewhere in my subconscious. I don’t know how I know, I just do. SBF is a winner.
But that wasn’t even the main thing. There was something else I felt: something in my heart, not just my gut. After sitting ten feet from him for most of the week, studying him in the human musk of the startup grind and chatting in between beanbag naps, I couldn’t shake the feeling that this guy is actually as selfless as he claims to be.
So I find myself convinced that, if SBF can keep his wits about him in the years ahead, he’s going to slay—that, just as Alameda was a stepping stone to FTX, FTX will be to the super-app. Banking will be disrupted and transformed by crypto, just as media was transformed and disrupted by the web. Something of the sort must happen eventually, as the current system, with its layers upon layers of intermediaries, is antiquated and prone to crashing—the global financial crisis of 2008 was just the latest in a long line of failures that occurred because banks didn’t actually know what was on their balance sheets. Crypto is money that can audit itself, no accountant or bookkeeper needed, and thus a financial system with the blockchain built in can, in theory, cut out most of the financial middlemen, to the advantage of all. Of course, that’s the pitch of every crypto company out there. The FTX competitive advantage? Ethical behavior. SBF is a Peter Singer–inspired utilitarian in a sea of Robert Nozick–inspired libertarians. He’s an ethical maximalist in an industry that’s overwhelmingly populated with ethical minimalists. I’m a Nozick man myself, but I know who I’d rather trust my money with: SBF, hands-down. And if he does end up saving the world as a side effect of being my banker, all the better."
This is fascinating. Thank you. It really adds depth and detail.
I was struck by this:
> So I find myself convinced that, if SBF can keep his wits about him in the years ahead, he’s going to slay…
If
> the way he (and it really is majorly down to him) ran the concern was terribly careless
You’ll get no argument from me on that one. This is not your grandfather’s corporate structure, to channel Ringo Starr. At the end of the day I am sure there are any number of laws they broke, but I don’t know much about the laws in the Bahamas re: corporate governance.
The big question is fraud, and it’s there that I hold my fire. If one’s opinion is that cryptocurrency itself is a fraud (an opinion I am open to), that doesn’t really point a finger at him on that score. Fraud requires intent. I don’t think his intent is very clear at the moment. Ponzi dealt with an asset that was universally considered sound: cash.
The hallmark of the eponymous scheme is that it can never make everyone whole, it must keep swimming or die, and when it dies someone is left holding the bag. Crypto as the underlying asset confounds this a bit.
I should read up on tulips; maybe that would shed some light.
Alright, what's your best price on making this post public?
I usually get about ten new subscriptions per subscriber-only post. If the average subscriber pays me $10/month and keeps subscribing for six months, that makes me $600. So I guess before writing this I would have wanted $600 to make it public, and I ought to stick to that even after I've written it to avoid feeling like I'm doing a bait-and-switch where subscribers regret subscribing to me because I just make posts public anyway.
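That back-of-envelope valuation can be sketched in a few lines. All of the figures are taken from the comment above (10 new subscriptions per locked post, $10/month, six-month average tenure); none of them are independently verified:

```python
# Rough value of one subscriber-only post, using the commenter's own
# estimates (assumptions, not audited figures).
new_subs_per_locked_post = 10   # new subscriptions attributed to one locked post
price_per_month = 10            # dollars per subscriber per month
avg_tenure_months = 6           # how long an average subscriber stays

# Expected lifetime revenue generated by keeping one post subscriber-only
value_of_locked_post = new_subs_per_locked_post * price_per_month * avg_tenure_months
print(value_of_locked_post)  # 600
```

This matches the $600 asking price stated in the comment; the real number is of course sensitive to all three estimates.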
I said this before, but think about $5/€5 a month. That’s a cost I consider free. I don’t think about it. I kept Apple TV going at €5 even in months when I never watched an episode, just because I didn’t care about €5. Now that it’s 6.99 I am not going to keep paying every month - just when there is something to watch. This is not very logical, I know, but I discount €5 a month and multiply everything else by 12, and think yearly.
Your free content is excellent and you want to reach the largest audience. That is noble. What you gain by paying is therefore not that much extra content. I’d be happy to pay you for no extra content at a lower price.
Another option could be to allow paid subscribers one day's access before the free articles are released.
For this to work, I would need twice as many people as are currently subscribed to be willing to subscribe at the lower price.
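The break-even claim in that reply follows from a simple ratio: to keep revenue flat after a price cut, the subscriber count must scale by the price ratio. A minimal sketch, assuming the $10 current price and the $5 proposal from the thread (and ignoring any change in average tenure):

```python
# Break-even check for a price cut: halving the price from $10/month to
# $5/month requires doubling the subscriber count to hold revenue flat.
current_price = 10   # dollars/month, current subscription price
proposed_price = 5   # dollars/month, suggested lower price

# How many times more subscribers are needed at the lower price
required_multiplier = current_price / proposed_price
print(required_multiplier)  # 2.0
```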
Although I know there are some people who would subscribe regardless of whether or not there's free content, I lose about 10% of subscribers per year naturally and I'm trying to figure out ways to convince more people to subscribe.
I subscribed because I saw something referenced somewhere (off-site) saying that your subscriber-only posts included the occasional ama.
I think that your please-subscribe pitch, such as it is, is neither particularly convincing nor particularly descriptive. If you're serious about wanting more subscribers, I'd seriously consider hiring a marketing person of some sort, just as a contract job to help you write those one or two lines.
I subscribed for the measured epistemology of your writing, but this post just seems mean. I'm certainly in your tribe but, if I'm going to be an honest data point, with much more of this persecuted-by-the-media thing I'm much less likely to re-up.
Is there a way to say "I'm sad that people are using stupid arguments to try to discredit my friends" without sounding "persecuted"?
Or do you disagree that these arguments are stupid or that the media is using ones like them? For example, do you disagree that my first point is a fair satire of https://marginalrevolution.com/marginalrevolution/2022/11/a-simple-point-about-existential-risk.html ?
Whether it sounds persecuted or not isn't the main point. Maybe half of all things sound justified to half of all humanity.
The main point is how much value I've got-and-paid-for and evangelized from the regular epistemology of your writing. What I want is the high standard you've set up: empathetic, analytic, and urgent.
"Is there a way to say "I'm sad that people are using stupid arguments to try to discredit my friends" without sounding "persecuted"?"
Not really. I won't say it's the only way to handle it, but Hypothetical You are going to get criticised anyway for anything that is perceived as being sympathetic to FTX/Bankman-Fried (see all the commentary about the media going easy on him and 'this is probably because he's a Democrat/he gave tons of money to Democrat politicians/he's one of the rich in their circles').
So go full-on "Some of my friends are caught up in this and I'm sick and tired of them getting called names for shit they didn't do so I'm going to defend them and if you don't like that, tough! Door's over there!"
(Like I said: Catholic. Sex abuse scandal. Been there, got the jokes, oh yeah and the "the reason the Catholic Church is anti-abortion is so that there will be a plentiful supply of child victims for priests to rape" memes, whatever you say isn't going to appease everyone, so feck the begrudgers).
Eh, it's like on Reddit. If someone attacks your post and you zip your lip or give a mild response, the group stays open to your ideas and after a while you get some interested or supportive responses. But if you complain the sub downvotes you to hell.
As a counter point, I am happy to get tribal posts in addition to measured epistemology.
I missed the point of every single parodied article in Scott's post -- I guess because I avoid almost all the mainstream media. I do not think I have read a single article about EA in the last year, so I have not suffered through any instances of some columnist misrepresenting and mistreating EA. So while I have nothing against this post of Scott's being made more widely available, I actually think it's less accessible than a lot of his stuff, because you have to have a sense of how the media treats EA to appreciate it. And did I see somebody earlier suggesting he edit it to make it appropriate for younger readers? I'm not sure I did. But if somebody did -- I think very few teens are going to be up on what the media is saying about EA, and thence able to appreciate this post.
Nolan E.’s suggestion of allowing paid subscribers early access is, I think, a good one. “It’s all free eventually, but if you help subsidize by subscribing you get looped in quicker” sounds like something people would pay for. I won my current subscription as a prize in the book review contest, but will consider re-upping as needed. Just getting everything hot off the press would have some value, as opposed to free access the next day.
That would have an interesting effect on the comment section land rush, too.
Good point! Might be nice to have the most invested commenters get first crack at it, and then everyone else can chime in.
I think the way I would do this, if I were going to do it, would be to have a "subscription drive" post every January, where I say something like "Here are some of the members-only posts you missed out on last year - [link to some previously members-only posts that I'm opening up for everyone]. Here are some that you still can't read without a subscription [links to those]. Get a subscription today and you'll be able to read posts like this as they come out!"
Any thoughts on this plan?
Do commenters write things in locked posts that they wouldn't want exposed to Internet search engines? You've given us no guarantee of privacy, of course, and "put not your trust in princes" yadda yadda.
I know I've tried not to write stuff in locked posts that would be Bad if spread around the Internet, but I have also noticed myself being a little looser-tongued than in open posts. (Part, I suppose, is just that I assume commenters in here will be more charitable and less nuts than out there.)
It would incentivize you to make the members-only posts less topical than the regular ones, so they would remain interesting next January. Which would incentivize me, for one, to continue subscribing.
"Abraham Lincoln, Necromancer" was one of the first things I read by you - it had me literally laughing out loud, and thinking "damn, this guy's smart and also hilarious." I've been reading your stuff ever since and - although I take at least a passing interest in all of it - the more timeless bits are my favorite. By "timeless" I mean either "irreverent take on bizarre historical episodes" or "quirky, funny fiction with some serious philosophical implications" or "galaxy-brained metaphors and speculations about Big Ideas." All of which you do exceptionally well.
On the other hand, if that caused the free stuff to shift towards a primary focus on current events, prediction markets, AI safety, Bay Area rationalism, and psychiatric medications, it might reduce the blog's ability to appeal to an intellectually diverse crowd. Which might, in turn, reduce the number of potential subscribers.
So yeah - I don't really have a recommendation, but those are my thoughts, and I hope you succeed at figuring out the best approach.
I would have expected you to price it on the basis of how much headache it would create for you if this were to be made public. Just doing the comment moderation on a public post would suck, I expect.
Sold. How should I pay?
This appears correct to me. You should post it publicly.
The last one is right though. The fighting racism battle is way past the point of diminishing returns and people who worry about it would be better off working on issues internal to their communities.
How does this compare with the way the media shredded the Sackler family? I acknowledge there are differences -- super-rich people who lied about the addiction potential of their pharmaceuticals and gave mountains of money to museums and opera houses and totally non-EA charities. Not only did the media go after them, The Law went after them with some success.
Some of this is just plain old resentment against rich people. Like Richard Cory, they glitter when they walk. They get worshipful press coverage for their good deeds. By contrast we're nothing, and we don't like being nothing. So when they fall, they get torn to shreds by the herd, we sickly and mediocre puddles of hatred. The rich, their charity and beneficence don't shine so brightly now! Turns out we needn't feel bad about ourselves!
That's really most of the story.
- An idealist group of mostly young white men has formed the Union movement. They claim that they want to improve working conditions for workers all over the world, but their methods have been controversial. Recent events have shown the catastrophic but predictable outcomes of the strategy called "strike-till-you-get-what-you-want" in Union circles. Famous philosophers have idealistically claimed that this method should theoretically be an effective way to improve working conditions and wages. But anyone with a common-sense understanding of real-world human psychology can predict that strikes will attract lazy people who do not want to work and who can now hide behind this fashionable cause. Worse, strikes may cause productive workers to become lazy due to the prolonged inaction. For every worker who strikes for improved conditions, there will be ten who strike out of pure laziness. Instead of striking, maybe workers would be better off encouraging a stoic mindset?
- Another issue for the Union movement is that some in the movement seem less concerned with simply improving working conditions, and more concerned with abstract ideas about perfect utopian societies. Inspired by philosophers, they predict that "socialism" will occur in the future, and that the Unions should prepare for it instead of focusing on improving working conditions now. While this may be a useful idea to think about, it is clear that these fringe theories are scaring off ordinary people who just care about wages and working conditions and who might otherwise want to join the Union movement. The Union movement would better achieve its goals if the person in charge would ban discussions about "socialism". Those discussing socialism could just form a new movement, and the person in charge could make sure that these two movements never overlap. That this hasn't happened already indicates that the Union movement is incompetent, and that it's more interested in theoretical speculation than real-world working conditions.
These echoes-of-Slate-Star-Scratchpad type subscriber posts are a nice cherry on top of the 5% paid content tithe, but for meta-consideration reasons I'm glad you're choosing to keep them paywalled. There's a time and place for half-serious half-trolling optically-spicy punching takes, and...God help me, I think Twitter works better for that than The Blog. Different product lines and all that, even if both are run by the same boss.
(It wouldn't be too much more work to polish this into a suitable-for-all-ages post, though. I think.)
"It wouldn't be too much more work to polish this into a suitable-for-all-ages post, though. I think."
What would this look like? I'm having trouble seeing the direction I would change it.