It seems like you chose a weird set of possible motivations. It seems much more likely that power simply supports the status quo, and it's much easier to say "whatever, just wait until the next election cycle" than "hey, we should overthrow this illegitimate government, which I am a part of"
The best a priori argument against the election being stolen is the absurdity of the freaking President of the United States, the most powerful man in the entire world, with enormous intelligence and law-enforcement resources at his fingertips, and a very wealthy man in his own right, (1) being caught by surprise, so that he could not prepare for it in the four long years he had in office, and (2) being unable to do anything about it but sputter and speechify after the fact. For both those things to be true Trump would have to be weirdly smart, so he could realize it was happening earlier and more certainly than anyone, but weirdly stupid, so he would fail to prepare for the possibility over his entire term in office.
I guess the second-best argument is that if someone is *going* to steal an election, they don't do it by razor-thin margins, for the obvious reason that this makes the outcome far more plausibly debatable. Elections *are* stolen in the Third World, and they used to be in the Communist bloc, all the time. But they are stolen by enormous, almost laughable margins. 98% voted for Dear Leader! What an awesome show of support! Et cetera. Nobody steals an election by 0.05% of the vote, that's like going into a bank and doing an armed robbery for $5 to buy a cup of coffee.
It's sort of a never-ending signals-and-detection arms race, true. No stable equilibrium in sight where The Ones Who Understand How To Read The Signs can finally lower their shoulders.
>And that suggests to me that the fact that there is a petition like that signed by climatologists on anthropogenic global warming suggests that this position is actually true.
That they're willing to sign this means that it's both politically correct and true. When the truth isn't politically correct, that's where you get hinting at things without quite coming out and saying them.
By "politically correct" I mean politically correct for the sphere in which these people are operating. By "true" I mean they're sure enough about it to put their professional reputation on the line by endorsing it.
'That they're willing to sign this means that it's both politically correct and true'
'By "true" I mean they're sure enough about it to put their professional reputation on the line by endorsing it.'
I'm not sure there's a difference. If something is politically correct, that means it has taken on the attribute of being True in some layer of the simulacrum. Something being "true" in base-level physical reality has very little bearing on the layer above it, where Truth is decided by "consensus".
I find it highly unlikely that any observations from reality could possibly change the decided establishment position on any issue.
It was a link to a very interesting Nature paper on long non-coding RNA, and some detailed work that suggested it had a *structural* as well as regulatory role to play in the epithelial-cell tight-junctions that keep capillaries from leaking. I did not keep a copy or link, but you can probably find it or related papers again by googling LASSIE (which is an acronym for this particular long non-coding RNA) and tight junctions.
Ivermectin bores me to tears, so not that of course.
I think the problem there is selectiveness. Ivermectin might have some vaguely plausible method of action, but so would a thousand other drugs. The question is why we are examining this drug in the first place, and if it's not for a very good reason, our prior probability of it being effective should be low.
On occasions when I've encountered a complete Fox News presentation, rather than a link to a single article (e.g. waiting rooms with media tuned to Fox News) I've been struck especially by the selection of news stories, compared to their competitors, and especially to the BBC.
The world according to Fox is much scarier and more dangerous than the world according to CNN et al., which is itself much nastier than the world according to the BBC, without dipping into anything clearly labelled as opinion. Fox News goes deeper fishing for bad events to report, especially violent ones. And the BBC plainly puts a lot of effort into finding and publishing positive news stories.
Sometimes specific media have other quirks. My local paper presents a lot of stories with victims outside of the white-cis-Christian demographic, and stresses the demographics of those victims. It also likes to feature members of the latest victim's demographic community bemoaning increased or persistent targeting of their demographic. While it's conceivable that the proportions presented accurately portray the proportions occurring, I doubt it, simply because the victim demographic is only prominently presented when non-white, non-Christian or similar.
“ The world according to Fox is much scarier and more dangerous than the world according to CNN et al.”
Strongly depends on the topic. Listening to CNN, every single American has died from COVID at least three times over. Well, at least those who weren’t killed by systemically racist gun violence first.
CNN has gotten a lot worse since Trump, and especially since COVID. Pre 2016, I’d probably agree with you.
My only exposure to Fox is through the clips that the YouTube algorithm pushes at me, and I have noticed exactly what you describe.
Furthermore, you can glean the respective political bias of both Fox and CNN just by reading the YouTube thumbnails…there is no need to actually watch the videos.
(Spoiler alert: the caption is always some variation of Our Guy - Good! / Your Guy - Bad!, with Your Guy - Bad! predominant.)
I suspect most of us who comment here on ACT have better than average BS detectors…I think I do, but have no idea how I developed it.
:prepares for comments section filled with people giving counterexamples for all the things Scott just said the media doesn't do:
I think part of the problem is that sufficiently advanced ability to find people who are lying and be unreasonably credulous to them is practically indistinguishable from lying - there's rules, but the rules have exploits so wide that one can reasonably call the game broken.
Haha, yes, that’s what I was just doing in another post, running a cruise ship through those exploits. It almost calls for the humour of the original Charlie and the Chocolate Factory ‘tell me more about…’ head-resting-on-fist meme, not to be mean, but because it fits and is always funny.
It is a game of inverted totalitarianism, with examples of all behaviours, which are there to fool you. Sometimes they lie outright, sometimes they make stuff up entirely, sometimes they twist things so far from base reality it may as well be made up. The media do all sorts of lying, and some of it is wink-wink ‘business’ news that is just corporate counterintelligence, or lies from the government where we pretend unemployment is under 5% when we have a workforce participation rate under 70%, etc. But there are other lies mixed in there, and some buried story with a handful of truth is the media’s continual counterexample, even if it is only 5% of their stories and never a front-page headline.
Turns out they are talented liars who play against our well-known strategies for spotting lies. Just like how con artists have an easy time stealing from overconfident doctors. Knowing your mark’s weaknesses is essential, and running a smart con doesn’t mean you have to be smarter than your mark.
You thinking you know what’s going on…is part of their model of propaganda. We can sit and think to ourselves, advertising doesn’t work on me! I know it is a lie. Some of the broader propaganda model is the obvious wink wink nudge nudge game and that exists as a meta layer of lies to lull even the vigilant into complacency so they can slip in other lies through that loosely woven net.
They wouldn’t do that! Dan Rather is just such an upstanding guy and oh so square jawed! They wouldn’t make up stories with no basis in reality; getting actors is hard, so they just insert footage of a totally different protest or riot instead. Which I’ve seen them do many, many times, with old riot footage from different cities and even different countries! How is doing that on purpose any different from paying actors to do it? Lazier, cheaper, and just as big a lie.
Was that teenage boy with a gun at that BLM protest turned riot an avowed racist doing racist things for racist reasons, gunning down innocent black people, as Biden and every left media outlet said pre-trial?
Do the core events of an armed teenager ‘defending’ a business owned by non-whites as he was attacked by a convicted white pedophile matter? When a random group of 3 adult men attack a boy by running him down, saying they want to kill him, smashing him with a skateboard, and trying to take his gun from him… and he shoots and kills 2 of them in base reality… how far from reality can you get before it just doesn’t even matter what reality was?
There are scare quotes around it, his stated purpose being there was to defend the property of his community (he worked there) as well as provide 1st aid.
He was clearly there in an attempt to provide a positive presence, certainly more positive than those protesting who caused billions in damage.
Who cares what he believed his stated purpose to be? What are you, a Kantian?
He was an untrained guy with a gun at a riot. That’s what police are for, because they’re actually trained in riot management. We don’t want vigilantes roaming the streets during riots, precisely because of the effect it evokes from irrational riotous criminals like those he killed. He was a negative presence at the protest, and it shouldn’t matter that he attempted to provide a positive presence. It should have been obvious to him that what he was doing was reckless and unhelpful, but he certainly doesn’t seem like the sharpest knife in the drawer.
I wouldn’t be so snarky if their style of argumentation weren’t completely at odds with rationalism. I’m not saying one has to be a fully robotic utilitarian, but to focus entirely on intent means one is biased.
Why? The media company's goal is to reinforce their consumers' worldview by feeding them bullshit; the goal of the media consumer -- maybe not for you, and definitely not for Scott, but the average consumer who is being targeted -- is to find someone who will feed them bullshit that reinforces their worldview. I don't see where there's any conflict of interest that would lead to an arms race.
I guess in exceptional situations, like when threatened by a global pandemic, your average reader might become more interested in knowing the actual truth than in having their pre-existing beliefs reinforced. In those specific cases, the situation might temporarily become very arms-race-like. But I don't see any reason for that to be true in the general case.
It's certainly an instrumental goal at least, insofar as the terminal goal for the media company is to turn a profit. Whether it's a terminal goal is more debatable.
Curses! Foiled again. I was about to mention Sixty Minutes manufacturing false data. (Planting an explosive to make a certain car's fuel system look dangerous. )
That was Dateline NBC. Sixty Minutes got taken in by the fake memos, and kept on doubling down until management called in an outside investigator who said "wtf."
Yeah, if handled a bit incorrectly it'd result in over-prioritizing the prior. Seems like an unstable equilibrium (like free speech), while, unfortunately, "distrust everything" is a stable equilibrium (like full censorship), just as trapped priors taught us.
I had the same thought. I kind of think of it as reading different kinds of graphs. At first, you might look at something and think it looks weird, then it's like "oh the scale is log10" or "x and y are counter to how I would have labeled them" or "the units are weird", etc, and once you know the lay of the land, you can decode the information and learn things.
A big part of the frustration of the moment is how much the rules of the game changed during the Trump presidency, at least for mainstream liberal media, and even for institutions. I thought I was reasonably familiar with the rules of the game, but things like the CDC statement on the BLM protests and the censorship of 'lab leak' theories caught me off guard – I would have previously said those weren't the kinds of lies to watch out for. It's been a difficult and frustrating adjustment.
This post, the attendant discussion, and others like it are valuable historical artifacts. Like the kids in the fairy tale dropping breadcrumbs, leaving a trail, these create markers. Later on we can read it and say “we were at that understanding in January 2022, and now it’s changed to (x).”
Nuclear/toxic exposure is a context where the “good harvest” approach has I think already been in effect. Even in more recent times with the military burn pit exposures - the agency saying there’s no problem, lined up versus thousands of sick people - in those contexts people assume the agency is lying, or a few individuals have prevented the agency from really looking, or even multiple careerist individuals have found it more beneficial not to look.
There’s an amazing research work called Wolves of Water. A guy in the UK was living with his family near a coastal site with some type of nuclear waste disposal in the water. His daughter developed cancer and he launched into a study of the situation and ended up correlating distance from the coast with cancer risk. During the Fukushima accident I wasn’t paying attention but I started noticing a few years later with the starfish die-off. I wound up on internet sites where people were posting their own atmospheric radiation data. A lot of it has been deleted now - the sites deleted - meaning, there went all that data.
The boundaries of what it’s acceptable to say about science have some less obvious frontiers. That’s one of them, the whole climate modification situation is another.
Something about COVID, it combined the “tell lies about toxic exposure” tendency within government, with a groundswell of millions of people needing to know the truth. Airborne things were usually radiation before, which “blows away” or dissipates or creates cancer rate spikes three years later when it’s almost impossible to really connect. Plausible deniability was baked in.
The conspiracy theorist gray area around toxic exposure is structured differently from what the post describes, I think. When it’s ideas, yes, it makes sense. When people are amassing competing medical data it gets harder to call it misunderstanding - even when there are plenty of places where reasonable people can disagree.
Epidemiologists found themselves saying things previously reserved for the nuclear folks via Covid, that’s one of the changes, I think.
This exactly. The media very much, and frankly very openly, tore up the rule book once Trump was elected. They tried to justify it by saying that essentially Trump broke the rules first, in a way that exploited the rules and made it impossible to report on him in a normal way. And I’m actually sympathetic to this! Trump DOES lie in a different, more bald faced sort of way than the average politician.
But the journos did not limit themselves to Trump’s bald faced lies, or even to Trump himself - now their favorite phrase is “said, without evidence” for whenever a GOPer says something they disagree with.
I don't know if I'm sympathetic or not... It seems they (media institutions) were faced with a crisis, and on the whole chose to meet it with the power of the Dark Side.
Bad Politics Man Made It Into Office will happen occasionally, regardless of your political affiliation.
The fact that the media treated a *gasp* not-elite *gasp* conservative being elected as a DEFCON 1 event *is* the kind of institutional bias that conservative "conspiracy theorists" are going on about constantly. I think they are wrong on the factual matters, but when *the entire elite establishment* including journalists, researchers, etc have made it transparently obvious that they're in the tank for "whoever isn't Trump" - if you're a Trump supporter, you have been given no actual reason to believe that they're being honest when they said "yeah all these abnormalities in election reporting are normal and happen every year".
They are experts, and as such can craft expert lies rather than normal lies.
This is a problem, because the message being sent by anyone with any sort of professional expert credential during the 2016 to 2020 timeframe was "Trump is the worst thing that has ever happened in the history of the United States and must be stopped at any cost". Trump supporters, in general, responded with "message received, you will stop at nothing to keep Trump and his politics out".
Yes, I totally agree. I came here to ask for advice on how to adjust better to these (now more common) types of lies. How could we have known that "masks don't work" meant "make sure masks are available for medical personnel"? What's the lesson to make ourselves better at finding the signal in the future?
I don't think this holds up to a close read of history. Media on both sides have covered many issues very poorly, either because they lied or just weren't knowledgeable enough on the subject to understand the truth. It's also a lot easier now to find criticism of all sides of the media, as well as easier for topic experts to weigh in directly on topics they know first hand.
Right - this is what a lot of liberals have missed. I went on a spiritual journey in Asia and tuned out the news from basically right after Trump got elected until Covid arrived. It took me a bit to figure out what was going on, but it ultimately became clear that the rules of the game radically changed during my three year absence. Lab leak and BLM narratives are good examples. We went from Chomsky's "Manufacturing Consent" paradigm, which is the paradigm Scott seems to still be working off here, to a new "Manufacturing Reality" paradigm, in which all bets are off. Yes, it is still possible to glean useful information from sources like mainstream academic journals and the New York Times. But one's level of skepticism has to be taken to another level, particularly in areas where there is a clear and established narrative. In those areas, you should expect to be, at the very least, misled. And, under the new rules, you may well be outright lied to.
That was what took me from Scott's camp to the "there's no lower bound" camp. There were many other examples - including some that come very close to what Scott pointed out the press wouldn't do, like the time they showed a foreign hospital full of sick people to imply it was happening right now in the US, or another time when they showed a firing range in Kentucky and claimed it was footage of an air raid in Syria (somehow they still had to switch countries - I guess there are *some* rules?)
Even more recently, I've read a story about Supreme Court justices where journalists claimed they said and did something, all the participants came out and plainly said "we never said or did that!", and the journalists still went "well, we still think you did, and you're lying to us because reasons".
Oh yes, and who doesn't know what gave birth to the "Let's go, Brandon" meme?
But for me the trigger moment where I arrived at the realization that there are no rules anymore - or at least there are no rules that I thought there were and none I could figure out. They will say literally anything or do literally anything if they think it'll serve whatever purpose they have. And the number of people among "experts" who are willing to stand up to this is extremely small. The number of people knowing the lies are lies still large, but most of them either don't have voice, or don't want the Eye of Sauron to focus on them for raising it.
The world became much scarier that day. And it's still pretty scary.
Yeah, right now we have basically two kinds of media. One is nicely controlled, polished, has a narrative, and will say anything to push whatever agenda they are currently driving; for them, you being informed is actually a negative - they want you to arrive at and be secured in an opinion they prescribe, and that's their only goal. They would gladly lie, suppress or distort information if they think it serves their goals, and they feel zero loyalty to their consumers - whom they see as raw material, not clients.
The other one is actually doing the function that the media is supposed to do - disseminate information about current events - but they have virtually no quality controls and only very weak and rudimentary reputation mechanisms, so the quality of the information varies wildly. They would never suppress or distort what they think is the truth, but what they say could be true, or it could be a figment of somebody's wild imagination.
Somehow one has to maintain sanity and be a responsible and informed citizen in this environment. It's not easy and it's not going to get easier anytime soon.
"But for me the trigger moment where I arrived at the realization that there are no rules anymore - or at least there are no rules that I thought there were and none I could figure out."
Sorry, what was the trigger moment? Apologies if I'm just selectively blind here, but you seem to have left the actual moment out.
Eh... It was more like "if BLM protests lead to more socioeconomic equity for Black people, then they are a net positive for public health". It was obvious BS at the time, but it wasn't an outright lie.
In other words, it asserted that people dying was an acceptable trade-off for people of a certain heredity having it better, while people of differing heredity were being told that their living normally was verboten because more people would die than if it wasn't.
That's not 'BS' though. Whoever has the power to define "public health" can assert such things; whoever has the power to implement policies under the rubric of "public health" has the authority to carry them out.
"Public health" - it's a funny old bird. A little thought will reveal that no gestalt emerges when a great many people's individual healths are assembled. Some people will be poorly, others hale and hearty, and all are wholly unaffected by whatever someone else's summing up of the delusive 'overall picture' might be, until such time as it begins to inform "public policy".
Is mild malnutrition for all an instance of better or worse "public health" than some being well-nourished and some others severely malnourished? Must a doctor be deputised by the public to pronounce on the matter? Do they get to ask for a second opinion?
There is thus no such 'thing' as "public health" in the sense of the notion of something good and objective which the words conjure up, but it is a powerful conjuration, and we have likely not yet seen the greatest works to be done in association with the incantation's utterance.
So many replies, and not a single one mentions that there was no "CDC statement on the BLM protests." A simple Google search reaffirms that I remembered this correctly. In fact, the *only thing* the CDC director said about the protests is that they probably spread COVID, and everyone involved should get tested. That was the director, speaking before Congress; the CDC never made any official statement as an organization.
I’m sure he was referring to the open letter signed by a bunch of self-professed public health experts, which included at least one who claimed to work at the CDC. There were also some tweets from a former CDC head to similar effect. You’re correct that the above commenter has the facts wrong. To the extent that the open letter was co-signed by actual public health experts, which is how it was presented to the public, it made those experts look pretty bad.
This sounds like an isolated demand for rigor. You are picking on a small inaccuracy with words to deflect from the main point, which you aren't honestly arguing against.
Yes, it's inaccurate to call it a CDC statement. But there was a letter co-signed by "public health experts". Now maybe those weren't real experts, or the mainstream consensus was against them? If that's your argument, then make it explicitly.
As I remember it, all sides took the letter at face value as an expression of what mainstream epidemiologists wanted to express publicly. Maybe many disagreed quietly, but I don't recall prominent officials or institutions coming out and saying anything.
I don't think it's a small inaccuracy. If the CDC made that statement, we'd be living in a very different universe.
What actually happened was very much in line with Scott's post. A lot of individual experts beclowned themselves. The media was a bit over-eager to report on this. For political reasons, a lot of people who knew better remained quiet.
You can argue that they should have provided alternative views, and the failure to do so indicates bias on CNN's part. I would agree with this. However:
1) At no point is this explicitly stated to be a consensus view. In fact, the letter itself--as quoted in the third paragraph of the article--claims to have been created "in response to emerging narratives that seemed to malign demonstrations as risky for the public health". An astute reader, reading between the lines, would take this as an admission that the letter does not express a consensus view.
2) The article provides actual numbers. 1,200 sounds like a lot, but is really just a tiny portion of the millions of health experts in the US. Many of these 1,200 come from a single university--which is noted early on in the article.
3) The very first words of the article are "A group of health and medical colleagues..." A reader with bounded distrust would notice that no major organization gave their blessing to this letter--only a bunch of individual people.
4) The letter itself doesn't actually contain any lies. Just terrible opinions. (In fact, the content of the letter is even worse than I remembered. I would not trust any individual doctor who signed it. Luckily, I'm unlikely to ever interact with these 1,200 people, who mostly live in a different part of the country.)
The article doesn't make this clear, but many signatories were not really health experts--some were even students. Yes, the media should have reported this. And, yes, more people should have spoken out. But these are all dynamics Scott mentions in the article. They're all well within "bounded distrust."
Whereas the person I responded to claimed that the rules had completely changed. Which might be true, if the CDC made this statement. But that's not what happened.
I actually agree that institutions generally became less trustworthy over the last 5 years. But I don't think we're in some totally new paradigm.
Thanks for laying this out. "There is no lower bound" type comments are admitting ignorance - which is fine - but some are trying to dress it up as new knowledge.
Based on the first section, this fatally fails to distinguish between Fox commentary (e.g., Tucker Carlson, Hannity) and Fox NEWS. They are different. Indeed, that's how Tucker beat one lawsuit: by arguing that no reasonable person would see what he does as news.
Yeah, it's a have-your-cake-and-eat-it-too situation. And even though Fox kind of started that model of "news opinion as opinion news", the success of their business guaranteed it would go on to infect virtually everything.
I think it's just an "eating the cake" situation. I mean, if you come to a place called "cake eatery", you expect people to eat cake there. And if you come to a place called "Tucker Carlson Tonight", you'd expect that tonight there will be a guy named Tucker Carlson telling you how he sees things. It's pretty hard to get more "what it says on the can" than that, isn't it?
If you make specific false (and defamatory/libellous) factual claims as part of an argument, the fact that other parts of the argument were opinion does not (in my view) make the lies OK. Nor does the fact that a careful observer could figure out that you're a habitual liar, not if your core audience believes the factual claims you're making are true.
I'm not sure what your point is here. Were you trying to impress on me that lying is not OK? I know that, but why did you feel the need to explain it to me? I certainly didn't say anything that might suggest otherwise.
What everybody believes is their personal business, and I am not sure I can make any claims about Carlson's audience's beliefs specifically, except to note that it's highly unlikely they'd stay in the audience for long if they thought what he says is usually false. Of course, if you do think that is the case, you can always withdraw yourself from that audience.
The point of my comment was, however, that it is strange to imply Carlson is pretending to do something he is not doing when he's running an opinion show specifically marked with his personal brand. What value you attribute to that brand is entirely up to you - but it is what it is, the opinion of a guy named Tucker Carlson, no more, no less. It doesn't make him more right or wrong than anybody else, it just makes claims that he's pretending to do something he's not doing unjustified. If he claims X is a fact and turns out X is not - he's still wrong. That can happen to the best of us - for example, we just witnessed several Supreme Court members openly proclaim wildly fictitious statements directly relevant both to the facts and the law of the case they were deciding. Sad, but that's the world we're living in. At least with Tucker you know what you're getting, and if you don't like it - you can easily stop getting it.
>for example, we just witnessed several Supreme Court members openly proclaim wildly fictitious statements directly relevant both to the facts and the law of the case they were deciding.
See Vox, for that matter. Somewhere in the SSCsphere is a passage complaining about how Vox will hide commentary among its "voxsplainers", where it's hard to notice unless you're paying special attention.
Except the people who accuse fox news of bias are talking about the actual news too. Everybody knows Carlson is biased; he's an opinion giver. Nobody thinks he is "unbiased" in the way news ought to be. They claim that the news presented by Fox News is itself unreliable.
What I thought about the case where Fox is showing the police news conference with a suspect named Abdullah is that I would worry that Fox is showing a news conference with a suspect for an unrelated crime. They won’t make up a news conference, but they will show an unrelated one as if it’s related until authorities explicitly say it isn’t.
Leaving aside multiple cases where media of all stripes have used bad art to illustrate current news, do you have examples of where Fox has presented a conference as addressing one issue when it was about a different one?
The first example that comes to mind is the series of articles about Hillary Clinton’s emails, all presented as if they contained new information, all of which turned out to be about the same emails that had been discussed for months. I think a lot of the Trump Russia stuff was like that too - something about some Russian activity is juxtaposed with some Trump statement to make it look like they were connected, even though there is no specific allegation of connection they are making.
Regurgitating a slightly repackaged old piece of news to keep it in the news cycle is a different thing than showing a news conference for one event and claiming it’s another.
Yeah, I wasn't imagining them specifically *claiming* it's another - I was imagining a situation where there's five suspects and the police holding a conference about all of them but they just show the one, or a situation where it's unclear whether the conference is about a suspect in this case or some other case and them juxtaposing it with this case in a way that makes it seem relevant, or any of a million other similar things.
> Except the people who accuse fox news of bias are talking about the actual news too.
If I had a nickel for every time someone followed up "Paper XYZ said [outrageous thing]" with a link to something clearly marked as an opinion piece, I'd be a wealthy man. There are claims to be made of bias in actual reporting, but an awful lot of the volume is clearly coming from folks who fail to make the distinction.
> Except the people who accuse fox news of bias are talking about the actual news too.
I feel like this isn't obviously true! I mean, of course there's people who think all sorts of things, but I'm very often hearing people distinguish between the news side and the opinion side.
One problem, which Fox and MSNBC and CNN have created for themselves, is muddying the distinction between their news and opinion sides. They will have Hannity or Maddow anchor election coverage and give news updates, while also opining wildly, making it harder to parse for a person without a background in news. I don’t think Fox was the first mover on this front, but they are really bad about it, so I understand why people find it so easy to denounce them with blanket statements.
I could be wrong, but I don't think MSNBC has a news side, it's all opinion. Or rather, NBC's news side is just "NBC News", without the "MSNBC" branding.
It’s fuzzy, but there is spillover between MSNBC and NBC, with Lester Holt working at MSNBC first and now anchoring NBC nightly news. NBC News has a lot of arms, and MSNBC is the most clearly editorial, but it’s also marketed as a news network.
This may not be the current status, but I hinge my opinion on Fox News coverage on a study rating the ratio of positive to negative news articles about McCain and Obama during their presidential race.
CNN et al. were in the area of 80% positive Obama stories and 20% positive McCain, while Fox had a narrow spread of about 10 points - something like 55% positive McCain and 45% positive Obama stories.
Much of the 'fake news; THEY are lying' craziness occurs because somehow, we have all apparently abandoned any difference between facts and opinions. "A happened" (such as a mass shooting, in the Fox example) is an observable, testable fact. "Therefore B should happen/become law/be done" is an *opinion* that we might agree with or disagree with.
Your point about the difference between Fox NEWS and commentary is really, IMHO, a broader point that applies to much of news, authority, and social discourse.
"I believe that in some sense, the academic establishment will work to cover up facts that go against their political leanings. But the experts in the field won't lie directly. They don't go on TV and say "The science has spoken, and there is strong evidence that immigrants in Sweden don't commit more violent crime than natives"."
I came to say this exact same thing. I feel Scott is taking the wrong lesson from the current media landscape. He has assumed the old rules still hold sway. But the rules are changing under his feet and his confidence is sorely misplaced.
That letter condemns conspiracy theories and very carefully avoids saying that the virus could not have found its way to humans through research activity. I think it's a pretty good example of the kind of not-quite-lying that experts do all the time.
My point isn't that it's not bad. My point is that you should disbelieve letters like that.
[edited to add]: To be clear, I agree it's really bad for a lot of reasons. The fact that it is political speech is itself bad. I just don't think it's a counterexample to Scott's argument.
I think I agree, but I think there's also an Overton window on science communication, and stuff like the letter(tm) pushes everything in an ungood direction. I keep waiting for someone to finally come out and say, "Look, it was a crisis moment, there was plenty of reasonable doubt in both directions, and we couldn't afford to have talking heads on the 5 o'clock news impugning - based on scant speculation - the country that not only had the most data on the virus, but also makes a substantial chunk of our meds and PPE." That, to me, is a perfectly reasonable defense, if someone would have the guts to say it.
I think one correct takeaway from the letter is that there probably is (or was at the time) some real genetic evidence that rules out or pushes back strongly against at least some varieties of non-natural origin. Another takeaway is that the question of the virus's origin is a politically-charged topic, and that the scientific community is probably going to be pretty biased in how they approach it. When a letter talks about "standing in solidarity" and "fighting disinformation" and only one sentence out of 18 makes a scientific claim, you should assume that it is mostly unreliable.
The elephant in the room is the censorship practiced by the big social media platforms, which spread to a lot of the "blogosphere". So opinions and evidence that ran counter to the guidance from the CDC and WHO were suppressed as "misinformation". The party line was / is that Hydroxychloroquine and Ivermectin were not only ineffective, but HCQ was dangerous due to a chance of heart issues. Yet there is a study from 2005 published on PubMed (an NIH website) that concludes that HCQ is effective against SARS viruses when given early and is well tolerated. The approach of Uttar Pradesh, where teams actively sought positive Covid cases and provided a kit that included Ivermectin and other palliatives, appears to have been quite successful.
The biggest red flag for me, besides the suppression of dialog, is the interference with the doctor / patient relationship. HCQ and Ivermectin, for example, are widely used with little adverse reaction. From the beginning there has been anecdotal evidence from doctors that patients who take HCQ for their autoimmune problems have handled Covid quite well. There is no evidence I've seen that HCQ or Ivermectin cause problems for Covid patients. So what is the justification for the reported suspension of doctors for prescribing them off label?
Which leads to the third major problem - the sloppy statistics with poorly documented rules for collection and the lack of granularity. Just today I saw that the stats from Hamburg were grossly wrong in asserting that the majority of recent cases were among the unvaccinated. The handling of Covid 19 by the medical and political establishment has been a hot mess.
> some real genetic evidence that rules out or pushes back strongly against at least some varieties of non-natural origin
That is much more narrow than what was claimed, and a team of scientists writing in a scientific journal know the difference.
> When a letter talks about "standing in solidarity" and "fighting disinformation" and only one sentence out of 18 makes a scientific claim, you should assume that it is mostly unreliable.
This is a strange way of reading that Lancet letter.
- It's in a scientific publication
- Written by scientists
- The only scientific claim made is that "this coronavirus originated in wildlife"
- They pepper that claim with a bunch of citations and erroneously state that those papers "overwhelmingly" support their one scientific claim
- The result of the letter is to push discussion of a legitimate line of scientific inquiry out of serious consideration by both scientists and the general public
Yes, there are the political statements in there. But there was a concrete scientific claim that appeared to non-experts that it was well-supported in the scientific literature. That lie was designed to be persuasive precisely BECAUSE it was made as scientific claim, not because it was an appeal to authority by scientists. That distinction crosses all the lines Scott lays out above.
I get pretty angry when someone does something that obviously serves no purpose except to deceive or confuse, and then it gets defended on the grounds that "everyone" sees through it so it's not really a lie.
Like, there's a price tag that says $9.99, and then someone tries to pay $9 for it, and the clerk scornfully explains that it costs $10, what's wrong with you? The store is engaged in deliberately-engineered psychological warfare to confuse their customers! It's not *especially* effective psychological warfare; most people manage to figure out the actual price (eventually); but it's effective enough that stores are measurably making money from it. Getting angry when a customer is confused by the thing that YOU did with THE EXPLICIT GOAL OF CONFUSING THEM is like beating someone and then complaining that they bled on you.
If the REASON you refer to your tax increases as "revenue enhancements" is that it makes people get less upset about them, then it is obviously a lie, and it obviously matters--otherwise it wouldn't work! Claiming that it's not a lie, or that no one is fooled, or that no one who matters is fooled, is just an attempt to escape responsibility for telling lies.
There are situations where it's legitimately OK to say untrue things because you aren't INTENDING to fool anyone--jokes, sarcasm, fiction, etc. But if the whole point is to profit by impeding your audience's understanding, then this defense is not even slightly available to you.
True, very good point on the 9.99 thing. I never got caught by those cheap tricks; I convert them so automatically it doesn't even bother me. But my GF is often caught, and you just made me realise I should not be mildly annoyed at her for that, but mildly annoyed at the supermarkets...
I think the primary harm is not from people forming inaccurate conscious beliefs about the price, but people having a split-second gut reaction to the first digit before they've even finished reading the whole number.
I suspect a lot of people who believe they are "never caught" by this are nonetheless being influenced to be statistically more likely to buy the thing compared to a counterfactual where the price tag said $10.
But I believe the subconscious nudging is on a continuum with the people who actually try to hand the cashier $9. The $9 people are just the ones where the trick worked better than the store would have preferred.
Even if it truly doesn't work on you, I think you should be upset that they tried. When manipulations work, you often don't notice; if you catch someone in a failed attempt, you should punish them as a deterrent against trying. (Compare: punishing an attempted pick-pocket who didn't manage to get your wallet.)
Unfortunately, many casually-manipulative business practices are so common in our culture that you can't find a competitor who doesn't do them. I don't feel our culture is sufficiently upset about this.
I don't recall seeing any "$X.99" prices in the (Australian) supermarkets for a couple of years. Not sure exactly why they stopped, but they seemingly did.
There's price discrimination via hide-the-cheap-brand, though.
You can also consider it as part of the social lubricant that allows a society of individuals with an enormous range of personal interests to coexist peacefully. In a large number of social transactions, probably most of them, the value received is not exactly balanced -- could hardly *be* exactly balanced. A certain amount of genteel obfuscation allows the relative gainer to appear gracious and the relative loser to save some face.
Id est, to take your simple example, when I buy a gallon of gas the cost of the gas is in the present but the value for me is in the future, so I'm a little grumbly about the transaction. Jesus! $70 to fill the tank! Grrr. Putting the price at $4.99 a gallon allows me a tiny bit of psychological self-delusion that I'm paying about $4/gallon instead of about $5/gallon. I *know* the truth if I think about it even for a second, and I've reconciled myself to its necessity, but the "5" is not staring me in the face the whole time I'm at the gas pump, so it's less annoying. The oil company is thus doing itself and me a slight favor by obfuscating the true price very slightly, so that it's less in my face and the transaction takes place with less annoyance.
We do this all through language. It's why the caring physician speaks of your mother's "passing" instead of using the brutal non-euphemistic word "death." It's why the teacher says you "aren't getting a passing grade" instead of the more brutal "you're failing." It's why that girl said "I think we should see other people" instead of "you're boring and I would rather cut my throat than commit to you." These things can be looked upon as "lies" because they are not unvarnished truth, but they also allow us psychological space to accommodate ourselves to some harsh realities. They are a very necessary part of how a species like ours gets along without incessant fighting over small (but painful) gains and losses.
True. Maybe worth mentioning that overdoing it turns it into sarcasm, which is worse than brutal truth: it's rubbing in your face that you are not only the loser in this transaction, but that the winner does not fear you even a little and makes fun of your helplessness. You are not just the loser of the transaction; you are a loser.
Like the health minister in my country. I always thought that he had a half-smile when he announced the new restrictions, contact limitations or lockdowns, punctuated by "I know it's hard" or equivalent...
Sure, people can be assholes, and power corrupts. I'm just pointing out Chesterton's fence here. Ambiguity in language exists not because generations of humans were too stupid to think up precise terms, or because they are always trying to con each other, but because we use those ambiguities to help ease social tensions that would otherwise have us at each other's throats more than we are.
I don't disagree with you really, but I want to point out that the clerk isn't the party who is carrying out psychological warfare. They probably didn't even place the price sticker. I think they're justifiably irritated in this case. They're not trying to trick anyone.
If the corporate overlords showed up irritated at the customer, it'd be a different story.
Clerks get berated for a lot of stuff that is not their fault.
It's highly likely that a given supermarket worker has placed at least a few price tags; my understanding is that restocking shelves vs. cashier are more of an as-needed substitution than different job titles.
More generally, you're getting at one of the basic functions of bureaucracy i.e. to conceal the guilty party both in the physical and informational sense from the aggrieved party. "Throw your hands up in defeat" is one response to that, yes, but it's obviously not perfect given that it's literally letting the bad guys win.
Well what else? If as an organism I am not 100% all the time maximizing my personal welfare (or at least that of my genetically or memetically related tribe) then my DNA is nonoptimal and my germ line will be replaced by another that isn't. Or rather, it would already have been replaced a million years ago, so you're only going to find an individual *not* acting that way if they are some weird sport mutant.
Right, you'd need everyone to be completely fair and neutral and then someone else enforcing that.
I apply one filter to politicians and advertisers, where it's generally "I can't prove them wrong in a court of law."
I don't have much choice in politicians and advertisers, but I can choose which news media I listen to. If I get the wrong impression (after applying the filter), they're wrong.
I don't think that this reductionist view that all our societal interactions are this strictly determined by genetics is correct. Genetics are certainly a large influence on behaviour, but a) genetics do not get tuned to maximise reproductive success but merely avoid too-serious reproductive failure, and b) our primary means of maintaining social structures is environmental, not genetic. Most humans are certainly capable and often do take short- and long-term actions that are not genetically optimal.
I haven't personally known anyone to do this. I've read some allegedly-true stories mocking customers who made errors due to common deliberately-confusing sales tactics that worked "too well", but I don't recall which tactic(s) specifically, so I picked an example on the basis of how succinctly it could be explained.
The distinction is between "deception" and "lying".
"Everyone" agrees that "$9.99" is deceptive and that it attempts to gain money in a zero-sum fashion (i.e. extract it from the customer).
Most people, including me and seemingly Scott, agree that it's scummy behaviour (most of the rest are scum).
Where people disagree with you is on the use of the *actual word* "lie". It seems useful to be able to distinguish various forms of deception from each other, and "made literally-false statements" is a category most people feel should have its own word. The apparent consensus for the word to use for that category is "lie".
In that specific sense of "lie", it is not a lie. The price says "9.99", and you can pay $9.99 at the checkout. It is literally true.
I'm not defending it or anything; it's bad. It's just a different sort of bad than "lying".
I think the comment I replied to above is pretty clearly using "lie" to mean something other than "make literally false statements".
Also, "lie" is not a term of art with a precise technical meaning, but I don't think your definition matches common usage, either.
If you say something you believe to be true, but you turn out to be wrong, I don't think most people would call that a "lie".
I don't think most people consider metaphors or idiomatic phrases to be "lies", even though they usually aren't literally true.
Conversely, in my life I've heard many phrases like "lie by omission", "lie with your actions", "the truth is the best lie", etc., implying that "lying" does not require literal falsehood.
My overall impression is that, in common usage, "lie" means something much closer to "attempt to mislead" than it does to "make literally false statements."
Setting the semantics aside: It's my impression that people have strong instincts about sticking to the literal truth when they are SPEAKING (in a context where they might be accused of lying), but LISTENERS have no such instincts and basically only care about intent. I suspect the literal truth standard was evolved as a defense against accusations, not as a behavioral norm.
In the specific case trebuchet went a bit too far, but your reply was extremely general.
PolitiFact generally makes the distinction and awards something like "Half True" to "literally true but misleading".
There's also the information-theoretic way of looking at things: you can actually rule out a lot of possible worlds from "tells the truth misleadingly" if you are sufficiently careful, but information from a known liar doesn't rule out any.
Believing that Covid was caused by a lab leak _is_ a conspiracy theory. Thinking that there is some, entirely debatable percent chance that it was caused by a lab leak is not.
I have yet to hear a single person say that there is 0% chance that Covid was caused by a lab leak. Yet, over and over again, I hear the type of people Scott describes in the article above insisting that it was "100% a lab leak."
That's because human thinking (and language, downstream from that) isn't well suited to explicitly deal with probabilities. Consider the very words "true" and "false" which naively seem to imply either 100% or 0% probability for some proposition, which is clearly unrealistic in most cases these words are deployed. What generally happens is that the most likely seeming hypothesis gets to be called "true", and everything else is "false" with increasing degrees of indignation/ridicule. When the previously "true" thing happens to lose its provisional status, this tends to generate much cognitive dissonance in the epistemologically unsophisticated.
In most contexts, believing means true with a large probability. >75%? >90%? Not sure, and it's not often important...
The context where believing in fact means being sure (100% probability) is religion, and, as a likely extension, beliefs used as markers sent to outgroups. I am pretty sure that when you are in the lab leak group, you will hear people discuss the (high, very high, almost sure but not 100%) lab leak probability, while it's those stupid sheep who insist that it's a 100% natural zoonosis that jumped to humans without any lab being involved, because it would be racist to say otherwise...
It is unclear to me what you are saying here. I think you are saying that you do see people on the non-lab leak side who express supreme confidence. I am sure such people exist. Possibly, people are just more likely to "dig in" when they feel like they are on the losing side of an intellectual debate.
As far as belief being some percent (>75%? >90%?), that wasn't really what I meant. I find lots of people say they are 90% sure, but no amount of contrary information will make them budge from that. Effectively they are 100% certain.
I am saying that, reading you, I suspect you are on the lab_has_nothing_to_do_with_it team, and that's why the out_of_lab opinion seems to you like a largely monolithic bloc of believers closed to discussion, while people who think the natural zoonosis is more likely seem more nuanced. People on the out_of_lab team will just have the reversed opinion: it's the people insisting no lab was involved who are monolithic fanatics refusing to fairly consider contrarian information, while in their own camp new information is processed and the belief in a lab leak is updated accordingly. Nobody vocal will ever go from a >50% to a <50% opinion, of course, and the minute adjustments will not be communicated to the other camp, because they would be misused to weaken their opponents in the eyes of the non-vocal bystanders.
I'm confused about how you're using the term "conspiracy theory" here. Both cases you describe seem like conspiracy theories. In case 1 (Believing that Covid was caused by a lab leak), the claim is that a conspiracy definitely happened. In case 2 (Thinking that there is some, entirely debatable percent chance that it was caused by a lab leak), the claim is that a conspiracy might have happened.
Either way, a conspiracy is involved. That makes them both conspiracy theories.
It seems like you're just using "conspiracy theory" as a synonym for "something that is false", as it has come to be used by a certain crowd recently. But that is ridiculous. Conspiring is one of the most basic, fundamental human behaviors.
Only in a sense where any statement about a concerted action by a group of people is a "conspiracy theory", since if they acted together, it's a "conspiracy", and if you think about it, it's a "theory".
But that's not what is usually meant by "conspiracy theory" and definitely not what was represented as "conspiracy theory" with regard to the lab leak hypothesis. What was represented is that it's a near certainty that it wasn't a lab leak, and we only say "near" out of scientific politeness because for all practical purposes it is as certain as any other fact we know about our reality; there was never a plausible reason to think otherwise; everybody who supports the lab leak idea is a freak or fringe operator; there was never any serious science behind it; any idea that it could be plausible or should be taken as a serious scientific statement is preposterous; and anybody who brings it up should be laughed out of the discussion immediately. They did not say "0%" explicitly, maybe, but they came within a Planck distance of it and put a huge billboard there saying "The Truth is here!". So I don't think the difference matters.
I see what you have done here. Us laughing at the absurdity of your conspiracy theory and paying it no heed _is_, in itself, a conspiracy theory. Of course, your conspiracy theory actually involves a conspiracy to hide and cover up this alleged event and by all of the scientists saying, "It certainly looks like it could be from natural sources." While our conspiracy involves us rolling our eyes and not following you down a rabbit hole. [edit: \s]
See how funny it works - you start with "nobody claimed there's 0% chance", and then you proceed to mock supporting the idea that there's a non-0% chance, and discuss the "absurdity of my conspiracy theory" - mind you, you don't even bother to establish that it's a "conspiracy theory" or why it's "absurd"; you imply it is a proven and foregone conclusion, and the only thing left for you is to mention this as an obvious fact - and imply your laughing at it, and rolling your eyes at it, is the only natural response, and call even addressing any of the concerns, even bothering to substantiate anything, "following you down a rabbit hole".
And with all that, you still never said the words "there's 0% chance" - so in fact your original claim is technically still correct. That's a masterful work.
"We stand together to strongly condemn conspiracy theories suggesting that COVID-19 does not have a natural origin."
I guess you could argue about the syntax here. But I interpret this as meaning: "all theories that suggest COVID-19 does not have a natural origin (eg lab leak) are conspiracy theories and we condemn them".
One could also argue that collecting viruses from bats in some cave, growing them in the lab and then accidentally infecting someone is still a "natural origin".
Yeah, there's no careful reading of that letter that makes it not an outright lie. There's no way to say that scientists “overwhelmingly conclude that this coronavirus originated in wildlife,” when there was none of the direct evidence then or now that we'd seen from SARS1 or from MERS to support that conclusion. This letter was a blatant use of manufactured 'consensus' to spread outright lies and subvert additional scientific scrutiny.
This was exactly the kind of outright lie - in the Lancet no less! - that Scott is claiming you're not supposed to get based on his 'how to read the media' construct.
"Our phones are made with all-natural semiconductors and all-natural Li-Po batteries, with an all-natural Gorilla Glass 9 screen and an all-natural plastic case. Get into the all-natural game, or explore the all-natural metaverse at blazing speeds on any all-natural social media platform. Or hook up the headset for an all-natural VR experience!"
Yes. That’s what all the fuss about “gain of function” research is about. Basically, the theory goes that the virus may have been originally from wildlife, but was intentionally modified to be more infectious in humans, and only then was it leaked.
To me "originate from" sounds ambiguous, it could mean leaking an exact copy of what they collected in wildlife, or improving the version they collected in wildlife (as opposed to designing a new virus from scratch). Not sure which one you meant.
I do not pay much attention to this, but it seems to me that people working in Wuhan were definitely doing the research of the latter kind, in general, and no one is even denying this.
The questions are:
1) Whether this is where COVID-19 actually came from... or whether they were working on a completely different virus that *didn't* leak.
2) Whether "people in Wuhan improving bat viruses to better infect humans" also included the American scientists... or whether the American scientists working in Wuhan were working on something different.
3) If the American scientists working in Wuhan were actually doing gain-of-function research, whether they were funded by National Institutes of Health despite the existing moratorium on such research.
And my impression is that the answers are:
1) They deny it, and it would be difficult to prove either way.
2) Yes.
3) *Technically* no; in the sense that yes, those scientists got NIH funding, but on paper that funding was meant for something different.
So the "no" side is saying "there is no paperwork about funding for gain-of-function research". And the "yes" side is saying "well, they *were* doing the gain-of-function research, and they *got* money from you; someone just made it seem on paper that the money was for something different, but as we all know, money is fungible".
They are condemning the *conspiracy theories* suggesting it wasn't natural. Not the *well-reasoned hypotheses* about how it could have been unnatural. There were a lot of crazy conspiracy theories about the topic and were rightly condemned.
You are changing the words used to draw conclusions - this is just another example of what Scott was discussing.
I disagree with this characterization. It looks like you're doing the thing you're accusing Watson of doing, namely "changing the words used to draw conclusions". The plain language of the letter in part states that they "overwhelmingly conclude that this coronavirus originated in wildlife,2, 3, 4, 5, 6, 7, 8, 9, 10 as have so many other emerging pathogens.11, 12"
Note that those last two references are of other pathogens that emerged without passing through the laboratory, and that is how they claim SARS2 emerged. The authors of the letter did not leave room to conclude they were talking about a narrow subset of laboratory-based origins. The statement was not as ambiguous as you're making it out to be.
I stand by my characterization as the following sentences explicitly call out conspiracy theories and praise scientific evidence.
"Conspiracy theories do nothing but create fear, rumours, and prejudice that jeopardise our global collaboration in the fight against this virus. We support the call from the Director-General of WHO to promote scientific evidence and unity over misinformation and conjecture."
You are strategically quoting the letter to make it seem like the signatories are concluding, when in fact... "Scientists from multiple countries ... overwhelmingly conclude"
Nowhere in the letter do they ever say, in so many words, that lab origination theories are also conspiracy theories.
Yes, but read the papers they are citing. Those scientists do not themselves "overwhelmingly conclude" what is claimed in the Lancet letter. Therefore, that claim "scientists from multiple countries overwhelmingly conclude" is made BY THE AUTHORS, not by those they cite. So either the authors of the Lancet paper are directly making the false claim, or they are falsely attributing the claim to others so it sounds like there's consensus about a thing they wish to say.
It doesn't matter whether I say, "Everything on the internet is true," or I say, "Abraham Lincoln once said, 'everything on the internet is true.'" Both statements are false. The second one tries to piggy-back on old Honest Abe to give me more authority, but if anything that should count as a second lie, not absolve me of telling the first lie.
Either way, they went on to make the positive claim - not attributed to those scientists from multiple countries, since it comes after that long list of citations - that this happened "as have so many other emerging pathogens." No plain reading of this letter supports the contention that they support a possible lab origin of the virus. They go out of their way to explicitly rule that out.
"Nowhere in the letter do they ever say, in so many words, that lab origination theories are also conspiracy theories."
Part of the confusion here is due to how you're using the term "conspiracy theory".
If scientists (presumably more than one) worked together to achieve COVID gain-of-function, and then it leaked (by accident or otherwise) and then they kept that fact secret, that would be a conspiracy. They conspired to commit a harmful act, and kept it a secret. Theories about this happening would be conspiracy theories. So the scientists are explicitly denying the lab leak.
You seem to be interpreting "conspiracy theory" not as "a theory that a conspiracy happened," but "some crazy theory only wackos believe that is false by definition". There is a motte and bailey happening here. The motte is "I'm only saying that the *really* crazy conspiracy theories with no evidence, like flat earth or inter-dimensional vampires, are false" and the bailey is "any theory that involves anyone conspiring in any way is false by definition."
"They are condemning the *conspiracy theories* suggesting it wasn't natural. Not the *well-reasoned hypotheses* about how it could have been unnatural."
Any well-reasoned hypothesis -- really, ANY hypothesis -- about how it could have been unnatural would be a conspiracy theory. If it is unnatural then, by definition, humans were responsible for it, and then they kept their responsibility secret. That is the definition of a conspiracy.
Despite how the media uses the term, a "conspiracy theory" does not mean "some crazy, unsubstantiated theory," it means "a theory about people conspiring."
It's not an accident that the media has tried to merge these two meanings together, however.
The letter says "Scientists from multiple countries have published and analysed genomes of the causative agent, severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2),1 and they overwhelmingly conclude that this coronavirus originated in wildlife." That was a flat lie if, as James says, the emails show that some of the signatories thought a lab leak was a likely explanation.
My friends who are relevant scientists generally went from believing the lab leak was likely to believing it was very unlikely, due to the structure of the virus not looking man-made - information that was not available at the start of the pandemic but rapidly became available.
Some of them have updated back to likely with the refinement to the theory that it's a natural virus from gain of function study rather than a designed virus - but that's also compatible with the Lancet letter which is only strongly against conspiracy theories that suggest that it's not of natural _origin_ - which a lab enhanced natural virus technically is.
People do sometimes legitimately change their minds due to available evidence.
> but that's also compatible with the Lancet letter which is only strongly against conspiracy theories that suggest that it's not of natural _origin_ - which a lab enhanced natural virus technically is
And if the letter had been explicit about what it meant by "natural origin", and that "natural origin" included the not-too-unlikely case of an animal virus which has undergone gain-of-function research (which surely they must have thought about at the time), then the letter would have been very reasonable.
Lab leak in itself suggests an accidental release of a non-genetically-engineered virus. A designed bio-weapon virus was never seriously on the table; very early on, a virus made human-compatible through GoF was the hot topic. The absence of markers of direct genetic manipulation (vs. GoF, which I understand as a kind of accelerated evolution under artificial selection) is imho a smoke screen, a diversion, which worked.
I agree that Scott is overly optimistic, even if he's right about the presence of some red lines in the lying game. But as others mentioned, those red lines are not only unofficial (Scott is well aware of that, hence the whole difficulty of guessing them), they are not fixed in time, and depend on the "expert" domain and, in the end, on each "expert". When you consider that most expert messages are not raw scientific literature, but the message of a specifically-picked expert reported by a journalist (who has other sets of red lines), guessing red lines becomes a losing game. Scott's CC example/warning is great: sure, the full IPCC report is only biased in interpretation and is quite careful/weaselly enough not to directly lie. In fact, I think it's largely honest... The policymaker summary is a very different beast already, and mainstream media reporting is one level up... But 99% of the public see only the latter... Where are the red lines there?
Same for the Covid scientific consensus... And even there, we need to be super careful about what we consider the base level. I think it's direct technical articles targeted at other researchers, where lies and bias remain manageable... if you think the medical replication crisis remains manageable... But op-eds, summaries, and policy recommendations are on a very different level, even if they get published in scientific journals. Lancet-gate, anyone? And it seems the non-consensus researchers for Covid are (still?) in larger proportion (and often have better individual publication indexes / pre-2019 reputations) than the NC scientists of climate change. I say "still" because once a field is politically loaded, being NC becomes a poor career choice, so the local moral/group standard is subject to large external incentives that operate at career timescales - one of the largest factors behind red lines evolving. Red lines are imho largely fear of being caught pants-down by your peers...
"Red lines are imho largely fear of being caught pants-down by your peers..." In fact, I'd like to push that further, because it explains something I intuitively do: paradoxically trust experts less the higher up in the hierarchy they are, and the more mediatic they are. In both cases, it means their peers are not really the other scientists/experts anymore. It's the other managers/politicians in the first case, and the journalists/media in the second. And those have much poorer red lines than technical scientists...
One thing to consider is that Scott might not be overly optimistic, but that he never believed the Lancet letter, because it didn't cross the lines he has in his head; it is just hard to describe what those lines look like in writing. When all this was happening, (as far as I know) he never voiced his opinion on the lab leak theory, which to me says that he never thought the evidence was certain in either direction. I think Scott is well calibrated, but it is impossible to put in writing all the rules you need to follow to be calibrated.
I share your impression, but it's probably because I often share Scott's opinions on "trigger" subjects, or at least the opinions I attribute to him.
And that's likely because Scott self-identifies as Grey tribe, like I do. He seems left-leaning while I probably am right-leaning (European here, so trying to understand the subtleties of US politico-societal categories is not easy), but still, we are close enough (as are most of the readers) to wonder if the well-calibrated impression is really good calibration, or just inner-tribe cozy feeling. I like to think that one of the Grey tribe's characteristics is to be really scientific in the old-fashioned way, and so to be especially calibrated to detect lies (non-facts presented as facts), so I lean toward the first explanation... Still, even if I'm not mistaken by tribal blindness, it means that such calibration is not really possible (or at least much, much harder to achieve) in other tribes. There you will not have any halo effect. Among the Greys, your social circle will really encourage fact gathering and scientific method above other factors, and you have friends who will behave the same way and whom you can rely on without doing all the work yourself - social trust is at least partially aligned with truth-seeking. In other circles? Not so.
A blatant example is the poverty EEG study: I don't think this piece could have been written by someone in the Blue tribe, even a Blue tribe scientist.
And yes, I think there are non-Grey-tribe scientists, and I think this is a problem - one of the factors behind the current issues even with scientific publication. Non-Grey scientists means Blue; the Red tribe does not really have any foot in the science playing field since the seventies, as far as I understand the tribes :=)
Greg. Gregory Cochran. In the blogroll. But yes, he admits the "red-pilled" need to keep their heads down. He knows math; that may have made him last till 2015 as a research associate.
I tend to automatically dismiss anyone who makes statements about a virus "looking man-made" or not, on the grounds that either he or I must be gravely confused about basic biochemistry. I can't think of any way to look at a DNA sequence, or a protein amino-acid sequence, or even a full 3D X-ray structure of a protein, and be in any position to say "Huh! Looks man-made..." I mean, unless there was a tiny (c) 2019 WIV stenciled on the fuselage somehow. These kinds of statements remind me of "Intelligent Design" lectures in which, say, the structure of a molecular motor is thrown up on the overhead and the speaker exclaims "Look at that! All these trusses and gearings -- surely that's designed by an intelligent mind..." I can't see much of a material difference between that kind of argument-from-astonishment and an argument that a virus looks man-made.
Of course, it could easily be that some subtle higher-order analysis is being done here - maybe some kind of homology mapping to existing wild-type genomes or something, who knows? - and so "looks like" is far more subtle, but therefore far more open to interpretative differences, than the naive reading would suggest.
Depends. If it had a bunch of pseudouridines in it, yeah maybe. But if it really is "mRNA" meaning not some closely-related compound we're calling "mRNA" for convenience so we don't have to tack on eight syllables of organic chemistry prefix, no I don't see how one could tell. It's a more sophisticated version of the popular delusion that there's a difference between "artificial chemicals" in your food and "natural ingredients" because your fructose molecule was synthesized in a strawberry plant versus in a stainless steel vat at Archer Daniels.
I believe there's more to it than the pseudouridyl. The sequences are more highly optimized than what's found in nature, e.g. lots of synonymous codons are replaced with more optimal versions that have more C and G in them. It's not impossible for such a thing to evolve but it seems it didn't do so in reality.
My understanding is that early gene-splicing techniques did leave telltale signs in the DNA structure, and it is the absence of those signs that was used as the basis for the claim that Covid couldn't be man-made.
However, newer CRISPR techniques (which the Wuhan lab would certainly have access to) do not leave these telltale signs, so the fact that this one particular older technique wasn't used, doesn't really prove anything. (And the people making that argument should have been aware of that.)
It was expressed more as 'if a lab was designing a virus, they wouldn't have done it like that' rather than 'it's possible to tell it's artificial even if for some reason they also tried very hard to mimic a virus of natural origin', fwiw. I'm not a biochemist myself so I can't directly evaluate the veracity, and I might be getting important terms muddled.
Yeah as I said that one strikes me as in the same class as the ID folks telling me there's no *way* the human heart couldn't have been designed by a superintelligent God-being because just *look* at how cleverly all the parts work together. I'm always underwhelmed by arguments from incredulity, or by the inference of human motivation and/or insight from the products. The human tendency towards anthropomorphization is just far too strong to trust that kind of argument. It's like the fact that if I don't find my car keys in the usual place, I'm basically driven by instincts to believe somebody moved them -- even if there exists many perfectly plausible alternate explanations, like my memory of where I left them is faulty.
Basically, by "it doesn't look man-made", they mean "it's either produced by evolution OR it's man-made by someone specifically trying to fake it being produced by evolution".
A fully-synthetic virus would have no particular reason to be extremely similar to pre-existing animal viruses, while a natural virus or a virus artificially derived from it would obviously have such a resemblance. COVID-19 is very similar to bat coronaviruses.
Patterns of codon use are another example. The genetic code has 64 codons but only codes for 21 different things; there is redundancy. However, for a variety of reasons natural genes are not distributed completely randomly among those codons; there is information there. Usually, artificial methods of producing genes use different signature patterns of codons than nature does simply out of convenience.
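As a toy illustration of what "patterns of codon use" means, here is a minimal Python sketch. The helper name and the toy sequence are made up for illustration, and it only tallies the six synonymous leucine codons rather than using a full 64-codon table, but it shows the kind of frequency signature being discussed:

```python
from collections import Counter

# The six synonymous codons that all code for leucine (standard genetic code).
LEU_CODONS = {"TTA", "TTG", "CTT", "CTC", "CTA", "CTG"}

def leucine_codon_usage(seq):
    """Fraction of each synonymous leucine codon in an in-frame coding sequence.

    Hypothetical helper for illustration: splits the sequence into codons,
    counts only leucine codons, and normalizes so the fractions sum to 1.
    """
    codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    counts = Counter(c for c in codons if c in LEU_CODONS)
    total = sum(counts.values())
    return {c: counts[c] / total for c in counts} if total else {}

# Toy sequence: three CTG codons and one TTA codon, all coding for leucine.
usage = leucine_codon_usage("CTGCTGCTGTTA")
print(usage)  # {'CTG': 0.75, 'TTA': 0.25}
```

Comparing such usage fractions against the background distribution of a natural host genome is (very roughly) the sort of signature test the comment above describes.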
This method cannot produce a "definitely not man-made" answer, because man knows all of these signatures and can fake them if sufficiently motivated, but it is worth an update that it doesn't come back as "definitely man-made".
You're aware of fluctuations, right? The explanation for small-town "cancer clusters" and other surprises in the random variation of small numbers away from their expected value. The SARS-CoV-2 genome is only ~30 kb long, and the part that people would intensely study (for the S protein) is ~3 kb.
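The scale of those fluctuations can be sketched with a binomial back-of-envelope. The numbers below are hypothetical (a per-site motif probability of 1% over a ~3 kb region, as an example only), but they show why counts in a short region routinely wander well away from their expected value:

```python
import math

def relative_sd(n, p):
    """Standard deviation of a Binomial(n, p) count, relative to its mean.

    If a feature occurs independently with probability p at each of n sites,
    the count has mean n*p and standard deviation sqrt(n*p*(1-p)).
    """
    mean = n * p
    sd = math.sqrt(n * p * (1 - p))
    return sd / mean

# Hypothetical: a motif with 1% per-site probability over 3,000 sites.
print(relative_sd(3000, 0.01))  # ~0.18, i.e. counts typically vary by ~18%
```

In other words, in a region this short, an apparent excess or deficit of some sequence feature is weak evidence by itself.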
"Fully-synthetic virus" was only ever brought up as a straw-man to argue against. No sane person ever claimed that Covid-19 was "written from scratch" or that it wasn't closely related to existing bat viruses.
But if they took a bat virus as their starting point and then did genetic engineering on it to make it more dangerous to humans, that would still count as "man-made" for the purpose of the lab leak discussion. Even if they left 95% of the virus as-is and only added one or two "features". Likewise if they did gain-of-function through a "guided evolution" approach, without modifying the genome directly.
So "the virus doesn't look like it was written from scratch, therefore we can discount the lab leak hypothesis as a crazy conspiracy theory" is not an argument that one can make in good faith. Yet it was made.
> My friends who are relevant scientists generally went from believing lab leak was likely to believing it was very unlikely due to the structure of the virus not looking man made
So why did they believe this was important? From the moment I heard of the Wuhan Institute of Virology, I had a model in my head where natural viruses are bred in captivity and can possibly escape.
> but that's also compatible with the Lancet letter which is only strongly against conspiracy theories that suggest that it's not of natural _origin_ - which a lab enhanced natural virus technically is.
This sounds more like deliberate misdirection. "We are confident that it is not [this thing that was never a concern]."
At the time a lot of people were talking about it being a deliberate engineered bioweapon or attack, not an accidental experiment escape. That's the theory that the Lancet letter was shooting down, not the lab accident one.
The news article that purports to show hypocrisy on the part of some of the signatories of the Lancet letter seems to be constructed in exactly the way news orgs lie all of the time.
The evidence takes quotes out of context from emails that we can't see, to make it seem like some of the scientists had high levels of confidence about a lab leak. But everything stated confidently in the summary is then hedged with "may", "might", "could" in the detail.
These types of subtle nuances in 'bullshit-detection' calibration seem like a skill that is apparently very difficult for a lot of people to attain, so I'm pretty skeptical of efforts to teach this kind of stuff in public schools in the form of 'media literacy' (although I would love to see evidence to the contrary). It seems just as difficult to teach as charisma: a lot of subtle nuances that are hard to communicate as bullet points, and that are better represented as complex multivariate distributions. I'm curious if anyone feels like they were 'taught' how to be really good at this by anything in particular.
Formal schooling doesn't even do a particularly good job of teaching things for which there are well understood, formal frameworks of right and wrong, like in the case of mathematics or foreign languages. There's not much reason to think it would be particularly adept at conveying a formal curriculum of media literacy. But it's an important soft skill that applies across a wide range of disciplines so you shouldn't be purposefully avoiding it either.
History at my school in the UK when I was about 15 was pretty good at this - primary and secondary sources, historiography generally. Not the specific detailed complexity in this post, but it felt like a good starting point for me. I think it ultimately helped me detect the newspaper and Government lies leading up to the Iraq war.
Yeah, GCSE History in the 1990s went pretty hard on historiography and sources and I think benefitted a lot from doing so.
It also has the advantage over trying to do it with current media that it's less controversial - if you're trying to tell people that the current President or PM is lying, then parents will complain; if you're telling people that some nineteenth-century President or PM was lying, they won't.
This was my experience though probably not until college (and not in the UK), but I definitely think it can be taught. Maybe a course on the history of political propaganda and the psychology of marketing should be required in high school now like civics was required a generation ago.
Rhetoric is also pretty useful for this purpose -- for mapping out the typology of logical fallacies and where they show up.
I think this would be hard to teach well for the same reason it's hard to teach people to detect lying. You can give some heuristics that will help, but a lot of it is System 1 stuff that you train through experience.
Even if it *could* be taught, a cynic might wonder if the authorities who design school curricula are actually interested in giving all students finely-tuned bullshit detectors. Be careful with that thing, you might accidentally point it at our own side!
We should teach people to lie. Make games where lying like this is a skill. The best way to defend yourself against an attack is to have practice using it yourself.
I don't think that's likely to help. If you try to teach students math, do they all learn the same amount of math? If two people both learn to lie, but one has a deeper understanding of deception than the other, the person with the deeper understanding of deception is systematically more easily able to lie to the other and catch attempts at lying.
Also, everyone already learns about how to lie *to some extent* without formal instruction. Everyone already has knowledge of how to lie, the question is not whether they know how, but whether there are gaps between their knowledge of deception and others'.
Could you reduce the extent of the gaps in people's deceptive ability by putting everyone through formal instruction in lying? Maybe, if you put everyone through enough of it, but teaching anything is also an opportunity cost, and I think even modest gains here would probably demand a heavily lying-based curriculum.
Yeah, you’re probably right. “How can we teach ordinary people how to reason accurately in the face of dishonest leadership” may end up being a silly question.
It's been pointed out that many debunkers came from the field of stage magic. Perhaps teaching kids stage magic, which would be interesting, would help them understand the same concepts but in a more intuitive way?
I think this would probably have the same problem as teaching lying directly, but more so. Some people will learn to be quite good at managing deception, others won't, and the goal isn't to turn out some people who're exceptional at managing deception, but to close gaps in ability.
Stage magicians are often skilled debunkers, but among people who study stage magic, how many people ever reach the level of professional stage magicians? In fact, stage magic might not even be teaching them skill at managing deception at all; you might observe the same thing if it were purely a process of selection, where the people who're most talented at deception tend to rise to the top of the field of stage magic. If that were the case, teaching kids magic to improve their abilities at managing deception might be like teaching them basketball to make them taller.
I think it's likely that learning magic at an elite level does cultivate deceptive skills, but I also don't think it's the case that dabbling in stage magic does much to prevent gullibility or credulousness, so I don't think some mandatory magic education would do much to close gaps in deceptive ability.
Growing up in rural Texas, I'd notice parents playing this game with their children all the time. How far can the parent stretch a truth into an outrageous fabrication, and for how long, before their kid notices? They do simple stuff when the kid is young (say, around 4), and it can get pretty deep when they're teenagers and they can dare to try against the parent. When the mark notices, it's great fun. And the game is *always* afoot. And it's useful, as an inoculation against shysters.
There's a particular flavor of this game I see in Texas that seems to extend across the southern US, and I fully expect analogs of it worldwide.
I've observed this sort of behavior in a couple different cultural contexts and suspect that it goes a long way towards inoculating people against gullibility.
Having also spent a lot of time in STEM environments and noting the high levels of credulity in people in those environments, I've also wondered whether constantly assessing the trustworthiness and intent of an informant competes with the ability to deeply process the content of the information. That is, it might be hard to learn algebra if you're constantly asking whether Lang is trying to pull a fast one on you.
I actually had an extended encounter online a couple years back with a serial fraudster. I cut off dealings with them pretty early on without any loss on my part, but I spent months trying to prevent them from taking in tens of thousands of dollars in Kickstarter fraud and catfishing a guy out of his life savings. I succeeded in the first goal, but failed at the second.
The catfishing victim was a Russian man, and when I started trying to convince him that his "online girlfriend" was bad news, he actually thought that *I* was the naive one. We talked (and argued) about this at length, and the impression I got was that, growing up and working in a lower-trust society where he was used to the idea that a certain amount of dishonesty is necessary to get by, he was actually way *less* sensitive than I was to the signs of "this person is clearly too untrustworthy to deal with." Because he was constantly dealing with people who were engaged in various sorts of duplicity, but not screwing him over personally, he just accepted the idea "this is how people normally behave," and didn't think "this is a warning sign that they might do the same to me."
In some respects, he was an unusually emotionally vulnerable and gullible person, and I'm sure the average person in Russia isn't like this. But for me, it really hammered home the idea that adjusting to a low-trust environment won't necessarily make you good at spotting deception, and some people will continue to be very bad at it.
This is also the case, somewhat, when you've lived in marginalized circles. Things that are "red flags" for other people, often erroneously, are just common things - and being able to distinguish "non-conformist with some unusual personal issues" from "scumbag" can be a legitimate challenge.
That's not really the way science or math works, when you're doing it right. Those are fields in which doubt is institutionalized. The way to learn algebra really well is to doubt *everything* you read in the math book, and sit down with paper and pencil and test it for yourself. Aha! You say x + x = 2x, nonsense! Let's try a few examples, ratfink. 2 + 2 = 2*2 = 4, hmm, check, let's try 3...et cetera. That's the way to learn the subject thoroughly and well, and anybody who masters math (or science) understands this.
Same with professional communication. I don't write *anything* in a journal article that isn't backed up, either with data and calculation right there, or a footnote to acre-feet of data and calculation elsewhere, going right back to the origins of the field. I don't ask the reader to take *anything* on my mere personal assurance, because my default assumption is that he is a hardened skeptic and will not believe anything without very substantial six-sigma proof.
But... sometimes, having the right kind of institutionalized doubt can leave you biased toward kinds of deception you hadn't considered... or biased toward others whom you assume are kept honest by the same commitment.
Or... and this is a more complicated thing... failing to realize that certain types of communication, where one is not literally spelling things out, are not dishonest.
I'm sorry, I can't even parse that. Can you give me an explicit example of what you mean? If you're pointing out scientists (or specialists of any sort) are just as gullible and irrational outside their area of expertise as any other schmo, sure, of course, no question about it.
I've seen the same thing in lower-income urban cultures. More susceptibility to some kinds of conspiracy theories, but more of a sense of suspicion against hucksterism.
It's interesting to compare the notion of evidence in science vs. law. There is considerable overlap, but there are differences too. The rules in law have evolved to deal with game-theoretic issues that the sciences mostly scrub themselves clean of.
The citizen watching the news should mostly be using lawyer-like rules rather than scientist-like rules. This is natural enough when we witness a bunch of ex-lawyers arguing in Parliament, but what happens when The Science becomes the news?
>We should teach people to lie. Make games where lying like this is a skill.
Allan B. Calhamer took care of that almost seventy years ago. And the current Diplomacy game over at DSL may be coming to a close in a few more (game) years, so if you need a refresher course in lying, detecting lies, and establishing trust when you and everyone around you is a liar, feel free to sign up for the next one :-)
"Data Secrets Lox", an ACX affiliated bulletin board. There were a few online Diplomacy games set up there when I was active, and a few more on Slate Star Codex back when it was active.
I would add to this good LSAT test prep classes. The reading comprehension sections are designed in part to penalize readers that treat phrases like "X mostly does Y," "X usually does Y," "X may lead to serious consequence Y," and "there is suggestive evidence from distinguished scientists at top universities that X could lead to Y," as synonyms for "X does Y."
There are a lot of widely available practice tests and it's easy to assess performance over time. I think after the first two tries, anyone who has a competitive or vested interest in trying to improve their score will learn that there is a cynical mode in their brain they can and need to switch into.
One of the most useful approaches to news and media I learned from The Last Psychiatrist back when that blog was active. "What do they WANT to be true?" I try to teach my kids to ask that question about every article they read and even to ask it about their teachers at school. I think it's a good framework for getting to the real facts in the world without getting too far into conspiracy rabbit holes.
An approach taught in my Indian philosophy class: read every article three times - the first time without analyzing, the second time arguing along with the author (thinking of examples which support the author), the third time arguing against the author (thinking of counter-examples and logical fallacies) - before coming to a conclusion.
Most of the time "media literacy" stuff seems to focus on identifying sources and not on perhaps more important things like how things are phrased and what information is left out.
I think I improved greatly in this area via high school speech and debate competitions. These required me to develop an argument - often about real-world issues - that was then subjected to intense scrutiny and literally scored in a competition. It trained the skills of both sound, logical reasoning, but ALSO how to present your arguments in a compelling manner. As someone said above, practice giving misleading-but-not-lying speeches is a great way to see how it's done and learn to see it in others. And in a fake debate competition context where often you have been assigned to the side of the argument you don't even support (another HUGELY useful practice), it's low stakes so you don't have to burn bridges the way real world arguments do.
I credit debate with making me better at speaking and reasoning, more sympathetic to those I disagree with, and more able to detect and dodge rhetorical flourishes and misleading arguments. Strongly recommend it as a way that's usually available in American public schools to train critical thinkers.
By contrast, i would say from my experience Model UN is not good at this. People give the same speeches, and it's really about negotiating as to who is the leader and who can take credit and jump on bandwagons, nothing really about the arguments themselves, and its very, very boring.
I learned this by having bad parents that manipulated me all the time - it took me 30 years to figure out them, but its sure helped me understand a lot of other interactions.
I was also thinking 'bullshit-detector' and 'media literacy' are what Scott is talking about. No idea, if it could be taught. But let's try! I read an interesting article years ago about things that can be learned, but not taught. Things can be learned by observing experts and experimenting, but not formalized. The example in the article was Chick Sexing (determining the sex of baby chicks). Something like that could probably still be taught in schools. It would just have to be showing lots of examples and having students read and analyze things.
I've argued for a long time that the critical skill not being taught is evaluating sources of information on internal evidence. Conventional school anti-teaches it. There are two sources of information, the teacher and the textbook, and you are supposed to believe what they tell you. Browsing the web, better yet getting involved in arguments online, is better. Anyone not braindead can see that the web is an unfiltered medium, hence the fact that someone says something online is no evidence it is true, so you need to develop ways of deciding what to believe. If you do a bad job of it you get embarrassed when you have been arguing that Adam Smith was in favor of public schooling, because someone said so online, and someone points you at the actual passage that makes it clear he wasn't.
But it's definitely not a "bulletpoint" thing - it's a "mode" of interaction that requires sorts of "parallel" information that can't be easily learned the way rationalists usually learn things.
A lot of it requires being able to use compartmentalized beliefs - like how to "interpret" others' signals in a way that suits your own purposes, and then shift the conversation in that direction.
A caveat: I am "Geschwind-type" neuroatypical, which I see as the opposite of autism on the spectrum - still, it took time to get from "non-verbal information I am consciously aware of" to "a mode where I can respond to that dynamically without actively processing it".
Re the Swedish piece: they have rules about what kind of research is ethical, and you have to have certain kinds of permits. The argument is about whether the authors followed the rules. I think they did follow the rules, and the prosecution is foolish - as the initial prosecutor concluded! For Scott to make this about "the establishment" is just really weird. Certainly doesn't make the point it seems he's trying to make.
Scott is suggesting that the establishment is more likely to *notice* this sort of infraction - the fact that rules weren’t perfectly followed - if the establishment doesn’t like the conclusion. Had the results been different, nobody (or fewer/less credible somebodies) would have complained to the authorities so there would have been less reason for an investigation to happen. Bias is introduced both at the reporting level (if one side is more likely to complain) and at the response level (if one side’s complaints are more likely to be taken seriously and turn into prosecution) even if the underlying rules about what *should* happen seem perfectly clear and objective and even-handed.
What do you mean by "if the result had been different"? It seems reasonable to think that if the results had been that native, Caucasian Swedes offended at a disproportionate rate, it might have been just as likely for there to have been an investigation, no?
Sure. But I wanted to know if you would predict an investigation if both were found equal?
I think your example (natives found more violent than immigrants) is avoiding the strongest form of the argument. If you do think an investigation would happen if both groups were found equally violent, then that clarifies the disagreement.
Ah, that's helpful! Thanks! I don't know how I'd normalize the distribution. In that case, I think I agree with you. I'd have to go back and look at the story again to be sure, and I don't have the time right now.
But that's basically just saying that studies with interesting results get more scrutiny than uninteresting studies. That's obviously true - interesting-sounding studies get more media attention and reach more people, so there is more chance they come to the attention of some regulator. That's very far from saying that studies whose results support anti-establishment ideologies get extra regulatory scrutiny (still a plausible claim, TBH).
No, it's saying that studies which confirm a certain set of priors tend to skate, while studies which tend to cut against those priors get scrutinized.
I refuse to believe that the Swedish authorities are actively poring through all the published papers that *anyone* has written looking for this kind of violation.
The only reason charges were brought in this case is that some busybody - likely a fellow academic - ratted him out to the powers-that-be. And the most obvious reason why that might have happened is that the complainant found the paper’s results offensive and was looking for a way to discredit the author for culture-war reasons.
If the paper’s findings are what caused a complaint to be filed, then a paper that either found no significant disparity or found a disparity in a direction that *reinforced* the dominant narrative would have gone unchallenged or at least would have been challenged *less* forcefully by *fewer* people than this paper was…which would substantially reduce the odds of charges getting filed.
That conclusion is inherent in the phrase “dominant narrative”: what it MEANS for a narrative to be dominant is that support for the narrative passes unchallenged while opposition to it does not, no?
The only way charges would have been filed if the paper had had different findings is if this were *personal* - somebody had an existing grudge against this particular researcher for some *prior* offense and this paper *incidentally* offered them a chance at payback. But my money’s on the other option. If we had a parallel world to run the experiment in I’d offer 20:1 odds the finds-no-difference paper passes muster with no legal challenge.
In modern society, a lot of things are illegal. Most people have done something for which they *could* be jailed.
Most people are *not* jailed, and not due to courts acquitting them, but because they're never indicted in the first place. Some of this is due to nondetection, but a large part is due to prosecutorial discretion. That is to say, a prosecutor can choose what he/she does and does not take to court. Note that there is very little accountability for this discretion; cases that don't go to court are normally invisible, and cases that do are usually seen as reasonable because the suspect is guilty (due to the first point: everyone is guilty).
When everyone is guilty but not everyone is prosecuted, prosecutors can use their discretion to pursue ideological projects by selectively jailing people they don't like. This is what Scott is alleging; that prosecutorial discretion would have spared someone whose study had the opposite result. (This is *very* hard to confirm or refute, which is part of the problem.)
"I think, I think, I think." You know, Scott, if you had even an iota of data here, instead of your unbounded faith in your own gut intuitions (aka priors), you might have something valuable here. All you're saying here is "If my unsubstantiated belief 1 is true, and unsubstantiated belief 2 is true, boy is that ever outrageous!"
For those interested:
I think any researcher who found that immigrants were great would not have the technicalities of their research subjected to this level of scrutiny, and that the permissioning system evolved partly out of a desire to be able to crush researchers in exactly these kinds of situations. I think this is a pretty common scenario, and part of a whole structure of norms and regulations that makes sure experts only produce research that favors one side of the political spectrum. So I think the outrage is justified, this is exactly what people mean when they accuse experts of being biased, and those accusations are completely true.
This doesn't seem to be necessary (i.e. doesn't seem to be making any sort of interesting claim which would justify the ways in which it's a bad post).
"Finally, the Marx thing was intended as a cutesy human interest story (albeit one with an obvious political motive) and everybody knows cutesy human interest stories are always false." It seems he kind of does. But largely, I agree that there's more nuance to be had in dealing with the various tentacles of a given media apparatus than was conveyed here, especially with respect to Fox.
I cannot find the word "opinion" anywhere on the Washington Post article about Lincoln. The URL suggests it is in the "history" section. I agree there is some vague sense in which it is more of an "opinion" piece than the election reporting, but separate from an obvious THIS IS AN OPINION FLAG, that's exactly the kind of not-universally-understood heuristic I'm talking about.
Would you call the poorly-reported childhood EEG study I blogged about recently in the NYT an opinion piece or not? If yes, how is it different from any other science reporting?
For the Lincoln/Marx piece we see it's on "Retropolis", and "Gillian Brockell is a staff writer for The Washington Post's history blog, Retropolis." So, we are on a blog, which is very much not an "article."
These distinctions are really important for understanding what you read.
I see this as pretty much reinforcing Scott’s point rather than diminishing it, though. “The Washington Post will tell different, somewhat more brazen lies in their blog section” is the sort of mostly-reliable heuristic that you need in order to have any chance of discerning the truth value of the news.
Yes, it reinforces the point of this piece. I just wish Scott would take his own lesson to heart and make an effort to use precise and correct words for published content, as these really do matter.
In a sense I think it almost shows the opposite. The Post thinks of the distinction between the blog and the news as the kind of transparent and legible distinction that makes things clear. But there’s a lot of redundancy in the signal too - the blog and the news have different kinds of stories and are written in different styles, so that even people who fail to pick up on the transparent signifier can still develop the kind of useful heuristics that lead them to understand where different levels and kinds of credibility attach.
How likely is it that the average reader is making that distinction? I also don't see any reason to excuse the Washington Post for publishing potential falsehoods just because it's on a "blog". We are right now commenting on a blog. Would we excuse Scott if he published complete falsehoods and lies?
We understand that Scott's blog is Scott's responsibility only; we don't go and blame SubStack for lies on Scott's blog. Similarly, no one here is defending blog author Gillian Brockell.
A "blog" is the author's writing with minimal oversight. If it got a bunch of editing and fact-checking it wouldn't be a blog. The WaPo would call similar writings that had been through the full editorial process something else like say "features."
Now the WaPo doesn't literally have zero responsibility for the blog - they chose to hire this person - but the organization doesn't "stand behind" blog writing in the same way that it would for real news articles.
But again, how likely is it that the average reader is making that distinction? I don't think your interpretation is the one held by most people. Blogs are no longer just places to write opinions or thoughts; they are often at the forefront of new reporting and are cited by mainstream news outlets all the time. I would also argue that, though it's called a "blog", this one in the Washington Post doesn't actually fit the commonly held view of a blog as a private place for one person or a group of people to publish their writing. In this case, WaPo controls everything about the blog except for, presumably, the topics covered in it. Publishing, marketing, and distribution are all handled by WaPo. This seems much more like a column to me, or at least a distinction without a difference.
It's interesting that you analogize to a column. I think even a below-average reader understands that George Will's columns are not backed by the Washington Post in the same way a news article is.
I don't know for sure, but I'd guess that the average subscriber to the Washington Post understands the difference between the blog content and the straight news content, but the typical person just clicking on WaPo links via Twitter probably doesn't.
With respect, I'm not going to take the time to find an unlinked story. Also, are you calling for big, bold labels "THIS IS AN OPINION" and "THIS IS A NEWS STORY"? Because that was what I took away from your post.
Back in the old days.... we had news pages and opinion pages and they showed up consistently in their respective separate places in the newspaper. And TV news had its own very structured and consistent format. 60 Minutes was like revolutionary for providing a mix of (sort of) news reporting, analysis, and some other just goofy shit all on the same show. And in terms of news consumption, the average (U.S.) public took in maybe three sources of news at most, all with these stable and familiar formats.
And then the internet and endless cable "news" TV happened and we've had two generations now of people who don't have these earlier reference points deeply ingrained into them. Stuff of both flavors -- news and opinion -- shows up everywhere all the time. And then print media and cable news, now needing to provide ever more flavors and variety of content spawned all kinds of in-between-y formats that get called things like "essays" or "analysis" or "blogs" or "topical newsletters" or "explainers" that are neither news nor opinion pieces, and are often huge amounts of nonsense chasing ad revenue.
I remember when Vox first started publishing "explainers" I would have to refrain from emailing them my ranting frustration that their explainers had all these subtle biases imported into them and how much more insidious that is than other kinds of "news" reporting because people didn't have the skills to interpret the bias of the explainers where we sort of had skills to interpret the bias of news. But that was a long time ago now too.
Scott seems to be writing from inside the first generation that didn't experience the predictable clarity between "news" and "opinion" as it played out in the more limited forums we had back in the old days. And so Scott seems less clear about the distinction but also the distinction is so much less clear than it once was, and it's only Gen X and older who would have the same kind of internal reference point for how this all used to feel, which is so hard to describe now relative to how it actually is.
(and I did have to walk to school backwards uphill with cardboard strapped to my feet)
A dimension of this mess that Scott is not touching on here is the whole "so who is an expert" quagmire. Think of the Covid fiasco, and the plethora of "experts" on all sorts of things it brought out of the woodwork. For people who are struggling with understanding a complex situation, it's often not a trust-the-experts vs. distrust-them situation: it's "who are the experts in the first place?"
Exactly. And once you have in your toolbox "this guy is not really an expert" + "he's not really lying literally" + "he's literally lying but that's just part of the game", seems like you have too many degrees of freedom when calling things "not total bullshit".
Exactly. Look at the google scholar page of someone like Peter McCullough. Sure, he’s not a virologist, but he is an expert in a relevant field and frankly has the credentials and publication history to back that up. Yet, he goes against consensus expert opinion. I’m not saying he’s right, but I can’t exactly dismiss him as not being an expert. Now, medicine isn’t my area of expertise, but I am a scientist. I have the sense to at least look at someone’s publication history. The average person is not going to be able to do that.
So you have another issue here: consensus vs dissenting voices. I’m personally in favor of hearing out dissenting voices, but I must admit, I’m having more trouble establishing what’s true and false in the current climate than I would like.
"That's probably a bigger lie (in some sense) then one extra mass shooting in a country with dozens of them"
bigger than
And
"people can’t differentiate the many many cases where the news lies from them from the other set of cases where the news is not, at this moment, actively lying."
It's not literally true that "experts in the field won't lie directly". There are two ways in which experts in the field will totally lie, and do so all the time. First, they'll be mistaken (maybe you don't count that as a lie, but from the point of view of an observer it can be functionally the same). For any proposition X, there's some distribution you get if you ask experts "how likely is it that X", and there'll be some (hopefully small) fraction of experts who are just wrong. Second, there's some fraction of experts who lack scruples. It can be a small fraction, I don't care, but it's nonzero, and so you can always find an expert to go on a podcast and blather any claims that you want.
This wouldn't matter, except now the other experts (who aren't grossly mistaken, and who have scruples) are likely to become in a sense complicit. "Not lying" is much easier than "calling out a lie". There are many reasons not to call out a lie --- political inconvenience, being associated with icky people who also call out the lie, not having enough time. People don't generally think (even if they claim otherwise) that there's some strong moral requirement to put your career on the line to correct some false statement made by a supposed "expert" in a paper, or online, or in the news. It's easy to justify inaction by saying "oh, the lie was of little consequence", without noticing how often that really means "of little consequence to *me and mine*".
The result of all of this is that if you consume the news, or the scientific literature, you can in fact be consuming outright lies. The small fraction who are grossly unethical, or outright stupid, make the lies (crossing the line!), and then others are reluctant to do anything about it (not crossing the line).
This doesn't invalidate the central idea of "bounded distrust". It's still the case that a sufficiently extreme lie ("the normal distribution posits that there is a normal human", or whatever that was) will receive substantial pushback --- although note that even there, people were reluctant to be associated with Razib Khan, and so took their names off of the petition! But this does move the invisible line of "things that are just not done" a bit further in the direction of dishonesty. What matters isn't so much what the median expert will *do* as what the median expert will *tolerate*.
From my viewpoint, it looks like the median expert will tolerate quite a lot of dishonesty, as long as it's "not of any consequence (for me and mine)". This varies by field, of course, as some fields have more of a culture of rudely calling out bad claims than others.
Other collective effects also reduce (in my eyes) the trustworthiness of amorphous "the experts". Just one example: who are the experts? Unscrupulous and incompetent researchers can create, by exploiting the politeness of their peers, an entire body of poor literature (here I'm thinking of "near-term quantum simulations", but there are plenty of others!). Now if I want to query the experts about this literature, who do I ask? The people who write papers about it? Not a good strategy, but it's very difficult to know who the correct expert to ask is. Should I ask "the inventor" of mRNA vaccines about the properties and effectiveness of the Pfizer/Moderna vaccines?
The upshot of all this is that if I have a friend who knows something about a field, I'm not particularly sensitive to all these collective effects, and I can reliably extract quite a bit of signal. If I'm relying on observations of the behavior and claims of "the experts" and "the journalists" and "the politicians", then even under optimistic assumptions about their individual honesty and competence, the amount of available signal is substantially reduced.
The World Socialist Web Site did a fine job of rounding up five famous American historians to denounce the bad history in the New York Times' "1619 Project." So, it's not always impossible to get real experts to speak out.
The concept of "everything" seems to very easily morph into the concept of "anything" in people's minds without them really noticing the difference, i.e. "You can't believe everything you read" becomes "You can't believe ANYTHING you read" and is defended with arguments that only support the former statement, not the latter.
If you don't have a reliable way to tell the non-trustworthy things from the rest, then they are actually the same. "Some of the oranges in this bowl are dangerous to eat" implies "you shouldn't eat any of the oranges in this bowl", if there is no good way to tell the bad oranges apart from the others.
If you assume there's no way to tell the difference between any two things, then you're defining "everything" and "anything" to mean the same thing, which works for your argument but plainly contradicts the meaning of the words.
I'm not defining them to be the same thing, I'm just saying that as long as I cannot tell the difference I have no choice but to treat them the same. I'm not sure what we're actually disagreeing on here.
Martin is right: he's not conflating everything with anything in general, but only on trust. If you have a bunch of facts and you know some are true and some may be false, without knowing which is which, you are forced to say all may be false so none can be fully trusted.
Not if you do the Bayesian thing where you have different confidence estimates for your beliefs and weigh them against the risk/reward of acting on a wrong/right belief.
Like, "some berries are poisonous" might be a good reason not to eat random berries off bushes if you aren't sure which are which.
On the other hand, something like "I can't be sure how much mold is on this fresh fruit, so I'm not going to eat any fresh fruit" is a bad heuristic to have.
On a bit of a joke note, I do like to imagine people playing this game with dangerous recreational drugs.
Is trying fentanyl a good idea or a bad one? I used to think about how anyone could possibly think trying heroin was a good idea - like thinking about it and saying "gee, is this likely to result in a good or bad outcome?"
Of course, the reality is that by the time people are ready to try heroin or fentanyl, they are already deep down the rabbit hole of bad heuristics.
True -- but still, it would be fair to say that all the oranges are dangerous. If one orange in a bowl of ten is poisonous, but I can't tell which is which, then from my perspective each of the oranges has a 10% chance of killing me, and is thus a dangerous orange. (Even if I may still need to risk it, if the alternative is certain starvation.)
Likewise, if 10% of news articles are highly misleading to the point of being basically false, but I don't have a good way of identifying the misleading ones, then it's fair to say that from my point of view all news is untrustworthy.
In both cases, the fact that there may exist other people who are better at identifying which of the oranges is poisonous / which of the news articles is trustworthy, and who can thus safely consume the rest, is of no help to me.
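The mixture-risk reasoning in the oranges thread can be sketched numerically. This is a toy model of my own (all numbers are illustrative assumptions, not anything from the discussion): if one orange in ten is poisonous and you can't tell them apart, each orange carries the same 10% risk, and whether to eat one reduces to comparing that risk against the risk of the alternative.

```python
# Toy model of the "indistinguishable bad oranges" argument.
# All probabilities here are made-up illustrative values.

def per_item_risk(n_items: int, n_bad: int) -> float:
    """If bad items can't be identified, each item carries the mixture risk."""
    return n_bad / n_items

def should_eat(risk_of_eating: float, risk_of_abstaining: float) -> bool:
    """Take the gamble only if it is safer than the alternative."""
    return risk_of_eating < risk_of_abstaining

risk = per_item_risk(10, 1)      # 1 poisonous orange in a bowl of 10
print(risk)                      # 0.1 -- every orange is "10% dangerous"

# A 10% chance of death vastly outweighs skipping a snack: don't eat.
print(should_eat(risk, 0.001))   # False
# But if the alternative is near-certain starvation, the gamble wins.
print(should_eat(risk, 0.95))    # True
```

The point of the sketch is the commenter's: better discrimination ability held by *other* people changes none of these numbers from your own vantage point; only your own ability to tell good from bad items lowers the per-item risk below the mixture average.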
Idiomatic for me! It's like when there's something playing in the background that you're not intentionally paying attention to. I'm also reminded of Paul Graham's coinage of "ambient thoughts" from http://www.paulgraham.com/top.html .
For me the strangest thing about that sentence was the existence of a TV at an airport gate. Hospital waiting room, sure. But I've never seen TVs at the airports I most frequent. Now that it's been brought to my attention, it looks not so different from a hospital waiting room on the relevant properties, so I wonder why they are not there (although I personally prefer their absence).
I see TVs at airports all the time (invariably playing CNN) and often have to do a lot of work to find a spot where I can sit without seeing or hearing them.
What airports do you go to? For me, the biggest advantage of having the status that gets you into the airport lounges is that it means you can find a space that doesn’t have a tv playing CNN airport edition.
I almost only travel within Europe. I vaguely remember watching part of the Olympics at an airport when I visited Canada many years ago, so indeed I have watched TV at an airport at least once, but these don't seem to exist in the places I know on this side of the Atlantic.
It's intriguing that TVs are added to waiting room-style places even though this is not an obvious boon, since there are costs associated with having a TV on 24/7 (or however long the waiting room is open). In cities with exactly one airport, you can't really decide to visit or avoid an airport on its TV-having status, so this shouldn't really be a consequence of competitive pressure... unless the whole point is that this started in closely clustered airports and grew from there?
Really, I'm intrigued. Why do something rather than nothing? Were there riots or loud talking in waiting rooms without TVs that led to someone having this idea? Was it that someone working at the doctor's office / airport / wherever was bored out of their mind and convinced their boss to install a TV, and this somehow became mainstream?
Great article examining something common that usually isn't thought about explicitly. I think trust is in most situations contextual - I know people who I'd trust not to steal or lie but not to show up on time etc.
For the political implications, I think trust and power are closely connected, because in a sense if you trust some one you give them power over you, since they can then control what you believe, which will then influence the choices you make.
Where this gets dangerous is not so much people giving up and trusting no one. It's when someone comes along with the message "all institutions are bad, trust no-one but me" and people believe them.
Because at that point because trust=power, they have quite a lot of power. You could do almost whatever you want and people will still support you. For example you could say that you didn't lock up your political opponents - they committed crimes. Or that you didn't overturn the election - you just found fraud. Or that you didn't start the war, or the war was necessary.
With you mostly, but I was waiting for you to acknowledge you were wrong about ivermectin, and why.
That you didn't says you haven't moved with the times, and although your general perception of the paradigm's workings is correct, the specifics have changed, and that is why you are still applying the old rules.
I regard conspiracy theorists a bit differently. This theory basically says that media is an interpretive process, effectively an act of mutual interpretation between broadcaster and receiver. The broadcaster is trying to convey what they want the other person to believe. So far we agree. But you posit that the receiver is trying to determine what is true and what is false in the broadcast, and that conspiracy theorists are people who are doing this badly.
I don't think that's true. I think the receiver is trying to determine what they should personally do. They're not actually invested in truth or the institution of news. (I suppose this makes me overly cynical since it means NEITHER side is invested in truth.) For example, take the vaccine stuff. The news is trying to broadcast the message the vaccine is safe, necessary, etc in an attempt to get the person to take the vaccine. The receiver isn't fundamentally trying to determine whether any of this is true. They are trying to decide whether they will take the vaccine. Whether they should socially pressure other people to. And so on. Part of that is undoubtedly determining whether the news is telling the truth. For example, if the news reports the vaccine makes you grow wings and no one's growing wings then that's pretty relevant. But only a part and it's certainly not a necessary condition.
Once a person makes a decision they construct an epistemology that justifies this decision. Or alternatively they already have an epistemology and it creates the belief. That's complex. Regardless, this is true for both broadcaster and receiver. Conspiracy theorists are people who construct epistemologies focused around conscious deception (a conspiracy). Like most epistemologies it's communal rather than individual. This creates a social-cultural network/pattern. Which of course the broadcasters and non-conspiracy theorists have too.
The conspiracy theorist's central unfalsifiable claim is both powerful and handicapping. Because it's unfalsifiable and often totalizing ("everything is Illuminati!") it makes it difficult for them to effectively achieve their ends. Even when they win, it often doesn't achieve what they want. On the other hand, this is an ideal way for the belief to spread and maintain itself. Someone with concrete goals ("get everyone vaccinated") must eventually come to their end. Someone with a vague unachievable goal ("eliminate the Illuminati") gets to flexibly gloss over policy details and apply their lens to every situation. And they never have to deal with the goal being achieved. Victories and defeats occur, but never the ultimate victory or defeat. And this fight can pay pretty direct benefits to its members, sometimes even on a society-wide scale.
In summary: Conspiracy theorists are not failing at being mainstream. They're succeeding at being conspiracy theorists.
For the record, while your overall point is interesting, the choice of examples (Fox News, immigrant rapists, ivermectin) is sort of annoying and leaves an aftertaste. Those topics are sort of emblematic of an intellectual niche which is, to be blunt, AMPLY covered by other outlets.
No, you just don't like these things so you think he shouldn't have brought them up. But please, PLEASE, show me these other outlets where immigrant rape statistics being effectively censored has been amply covered? It REALLY REALLY sounds like you just don't want people knowing these statistics, especially given your track record on this blog (i.e. literally calling mainstream behavior heritability research "1920s eugenics")
Much less of this, please. I am also interested in an answer to the question. I think there's a small chance, but nonetheless worth investigating, that you might not actually be able to find all that many, to your own surprise.
What other examples would you have used? I need to use something where the media is biased/lying and people are angry about it, that kind of by definition means culture warrior-y stuff that makes lots of people angry.
I like ACX because you always do a good job of finding interesting examples that are off the beaten path, things I would never have seen or thought of.
I also think they are fine. Another case where you can get angry is when the media covers something you really know - sometimes because it's your area of expertise, sometimes because it's about you, someone close to you (or just someone you personally know), or your neighborhood. This last case is super enlightening because it's so directly brutal and gives you a lasting lesson about how much you should trust the media.
A frequent reaction is to want to punch the journalist in the face. Not always, sometimes it's the other way around, but to really appreciate the reporter in the second case you need to have experienced the first kind, just to see how bad it could be :-).
Unfortunately, this kind of personal expertise does not lead to good examples, as people not involved have, by definition, very little knowledge apart from what is reported.
I think that your examples are fine within a certain context and framework. That said, when you say "What’s the flipped version of this scenario for the other political tribe?" there's an implication there that this is an essentially equivalent case, only flipped for tribal politics, but here we can see that the left lies while the right does not.
I didn't take you as intending to convey that impression in your essay. But, from the time I've spent on the SSC subreddit from before the culture wars content was split off into its own sub, I honestly do think that a non-hypothetical, and probably quite large portion of your reader base would interpret the essay in exactly that light. Either "Scott chose these examples because he wants to make the point that the liberal media lie more than the conservative media, because he's a conservative and naturally wants people to think that," or "Scott chose these examples because he's correctly pointing out that the liberal media are fundamentally more dishonest than conservative media."
I don't agree that people should think that Scott is saying the liberal media lies more or is worse than conservative media. He starts off the post with the assumption that Fox News is loose with the truth and that most people agree with that assumption. I think the main point he's making, one that has been reinforced over quite a few articles, is that almost ALL of the popular media outlets are fairly dishonest and how much of a problem this is.
P.S. this is my first post after being a long time reader.
I don't think people should think he's saying that. But my impression is, this isn't just a theoretically plausible way people might read the essay, but a way that a significant portion of his audience does read his work in practice.
I was (and to an extent still am) a regular commenter on the SSC subreddit for years, and I've taken positions arguing from both the left and right on different subjects on numerous occasions, so I feel it's given me a sense for what the political skews among that portion of his audience base at least actually look like.
I'm torn on how much the audience's interpretation should matter in this case. As a long time reader, I've always gotten the impression that Scott was someone who had left-of-center beliefs but who was more focused on finding the truth of issues and looking for common ground than on toeing the party line. Given this framework, I think he is more concerned with liberals who view anyone who doesn't agree with the media-narrative-du-jour as a troglodyte than with the conservatives who will use this as an excuse to dunk on liberal media for being loose with the truth. Given that the vast majority of the media leans heavily left (I hope we can agree on this point), I think that makes his choice of examples justified.
I think this may be more Scott's focus, but in my time discussing politics on the SSC subreddit before that was split off into its own community, I spent about equal numbers of conversations arguing from the left and from the right on different positions, and in my experience, there was a really large and unsubtle difference in the pushback and vote scores I got depending on which side I was arguing. When I argued from the left, I would get *dramatically* more pushback, and lower vote scores, than when I argued from the right, despite the fact that I adapted to this by putting more effort into my arguments when I argued from the left.
So, I think that given that context of his audience base, being concerned for readers interpreting his writing through a lens of "of course, the left wing is obviously way more untrustworthy than the right wing" strikes me as fairly well warranted.
Your examples are fine; you have just succeeded in making a few people uncomfortable, which is exactly what should happen when you’re criticizing institutional patterns.
Viz., the guy who started this and thinks that it’s not respectable to cover the culture war (and then immediately lashes out at responses with thinly veiled ad hominems), or the guy who thinks your writing doesn’t show sufficient both-sides-ism.
Critically, this is also true of media that claims it is intended to inform. Finding something that actually *is* intended to inform is 90% of the battle.
I think it’s important to interpret these cable news channels as an appropriate mixture of the two. They very much are not doing what HBO or even NBC Must See TV or whatever is doing. They’re a lot closer to Us Weekly, which is giving you infotainment of a sort.
Well yeah. That's because most media consumers are far more interested in entertainment than in information acquisition per se. The purpose of most human conversation is more to buttress pre-existing intuitions and signal community belonging than to genuinely exchange data. Same as in most social species.
It's the most important reason. Follow the money, always a good first rule of evaluating human transactions. We all gotta eat. So if you want to know why Vendor X produces Product Y, ask yourself what his consumers want, and the answer is almost always Y, even if they *say* Z. (And if they *do* say Z, you'll also find a robust industry of people who are selling "This Z is really Y if you think about it" stickers.)
I don't assume people want influence for the sake of influence per se. The genes of Genghis Khan are relatively dilute by now. But "influence" = "sales" and "sales" = "my mortgage gets paid and maybe I can buy a new iPhone," and I think that's what dominates the actual thinking. We are all descended from umpty generations of humans who were very successful at convincing the tribe that when the food ran short *our* contributions were very necessary so let someone else starve. That instinct is deeply wired into us.
I would have agreed 100% with you pre-COVID. Now I only mostly agree. Using only "follow the money" I would have predicted the situation getting back to normal much faster, even if that cynically means slightly more people dying in the retirement homes. Nope. Measures involving large and widespread monetary losses were taken. It seems politics has much more power relative to the economy than I thought, even in the West... Or maybe I am not subtle enough at following the money.
I now think I have underestimated the non-monetary currents in the modern Western world. MeToo is another example. Where is the money flow there?
I think ideology did not die in the nineties. It took a nap, but it is back, in other forms.
I'm not disagreeing there are other factors at work, certainly. I did say it was the *first* rule, but not the only rule :) But let me also suggest that often it can be quite challenging to follow the money, so to speak. That is, the *way* in which this or that situation can be profitable for the people encouraging it can be fairly byzantine, hard to untangle. For example, *who* is losing money because of those measures? When you examine that question, it often seems to me there's a suspicious imbalance: it tends to be the people who are not part of the decision-maker's in-group. And let us also remember that just because people *on the whole* are becoming impoverished doesn't mean any one particular group is. There's such a thing as war profiteering and short-selling -- you can become quite rich in ways that exploit the descent of others into poverty.
I'm a professional media critic. My assumption from decades of close reading of the New York Times is that if I read a statement in the Times, it's very likely true. For example, if the New York Times tells me an Asian woman named Michelle Go was shoved to her death on the subway tracks by a man named Simon Martial, I'm sure that's true.
If the Times were to tell me Simon Martial is white, I'm sure they wouldn't be lying.
On the other hand, the Times finds some other facts are not fit to print. In particular, the Times does not like to go out of its way to raise doubts in the minds of its subscribers about their general picture of who are the Good Guys and who are the Bad Guys that they've developed over their years of relying on the Times for news.
Therefore, both Times articles I've read that mentioned that victim Michelle Go is Asian failed to mention the race of perp Simon Martial.
Coulter's Law states that if the news media report on an outrageous crime but don't let you know the race of the perp, he's usually black and almost never white.
More specifically, the Times has heavily promoted the theory that violence against Asians is due to Trump saying the words "China virus" a couple of years ago. This is a popular idea among The Times' paying subscribers. An alternative hypothesis is that misbehavior by blacks (e.g., shootings and car crashes) is way up since the mostly peaceful protests of the racial reckoning.
But most subscribers do not want to hear evidence for that. To even entertain that idea would raise serious questions about who exactly are the good guys: Is the Times itself a bad guy for promoting a bad idea -- Black Lives Matterism -- that has gotten thousands of incremental blacks killed violently since 5/25/20? Most of the Times' millions of subscribers are quite content with their notions of who are the good guys and who are the bad guys (Trump and Trump supporters) that they've derived from reading the Times and might not renew their subscriptions if the Times itself were to print more facts challenging the worldview the Times has inculcated in them.
But it's even more complicated than that: many Times reporters are excellent and would prefer to report the full story. So, what I've often noticed, is a frequent compromise between the marketing needs of the Times to not trouble subscribers with unwelcome facts and with the reporters' desires to publish interesting facts. Often, if you read NYT articles all the way to the end, you'll stumble in the later paragraphs upon subversive facts that, if you think carefully about their implications, undermine the impression the headline and opening paragraphs give. Of course, most subscribers have stopped reading by that point so they never notice.
I appreciate your comment on the subversive facts hidden 3/4 of the way through an article. Do you think it’s that the editors stop reading at the 2/3 mark, so the writer knows whatever’s at the end will get through? Or is the editor letting it through, based on the “no one reads this far” approach, so they can safely give the writer what the writer wanted?
I think people who work for the New York Times are mostly really good at their jobs, so, yeah, I assume editors definitely read all the way to the end of articles they are editing.
I imagine that unwelcome headlines or topic statements could elicit emails from the Marketing department saying that focus groups make clear that this kind of thing is not pleasing to paying subscribers, or could elicit cancellation attempts from the Junior Volunteer Thought Police of low level workers/true believers.
Generally, when NYT reporters drop undermining facts into the second half of the article they don't spell out that they debunk the impression given by the first half of the article. I often wind up saying to myself when I get toward the end of an NYT article and finally read some key facts, "Oh ... so _that's_ what's going on! Now it all makes sense." But I doubt if many other people notice this pattern.
I definitely notice - I think lots of outlets do it. I used to assume it was due to cut-and-pasting from different wire services. Small papers do it too. I think the WSJ does it less often. The Atlantic starts dropping things in earlier but spends fewer inches on it.
It's not at all rare to find contrary evidence in a news story. How it typically happens is for the main purpose of the story to get stated in complete form, with supporting evidence, and then a small "Congressman Bob [from the opponent's party] said that it wasn't true," and then not offering much or any supporting evidence on that side.
We typically call that "spin" and it's definitely related to the overall topic, though not as severe. Spin has existed forever, but the outright lying (directly or through obvious omission) is either newer or more pronounced than it used to be.
There may be another very important reason to include that material, especially in a part of the article less likely to actually be read. By doing so, the Times can accurately claim that they presented evidence to the contrary and a more complete story. It's similar to printing a tiny retraction on page 10 to a false front page story. They can accurately state that they printed a retraction, even if a much smaller audience actually read it.
I'd consider CYA inserts in NYT articles to be a different class of things: e.g., "A spokesperson for the Tobacco Lobbyists Association denied everything."
I'm thinking more of where you get told something in the 14th paragraph that causes the scales to drop from your eyes: e.g., you find out the female Linux expert whose hobby is memorizing which ways putts break on every green in the World's Top 100 Golf Courses, which proves that women are just as good at 3-d cognitive visualization, used to be a man.
If they don't mention the inconvenient facts at all, they leave themselves wide open to accusations of being biased.
But if they can retort "but we did mention that, look, it's right here in the article, you just didn't read it!" you have to get into a much muddier discussion about how misleading the headline + opening paragraph are when the facts are mentioned later in the article, and whether it is or isn't reasonable to expect all readers to read the article all the way through.
I’ve seen similar fact switches in the Economist too. The article will start with a very pro-market, soft economically-right view that won’t challenge any executive reading it, and is basically true. Then the last few paragraphs will show the complexity and nuance, and indicate the need for regulation or some other more left-wing intervention to create the best outcome. In this case I quite like it, but then I like the Economist because it likes nuance.
When I subscribed to The Economist in the early 1980s, I was wowed by those big 20 page long super-articles in the middle of the magazine on one general topic. But the short articles on US gave the impression of having been worked up by clever young Oxbridge grads who, despite a better way with words than their American counterparts, didn't really know much about America.
Agreed, the Economist is my favorite news source for very similar reasons. The number of times I have raged at a headline and then been feeling more charitable by the end of the article, as they add the "well but alsos" is very funny. I also like that they are a bit more open about their biases - there are more naked values and judgment statements in their writing than in most general world news sources. It makes it easier to spot where the just-the-facts part ends, and where the Economist-editorial-position begins.
I used to like the Economist and read it very regularly for almost 5 or 6 years until I experienced something like the Gell-Mann amnesia effect. On things I knew really well, I started realising that their reporting was consistently wrong or ill-informed, and that made me realise I should probably value the rest of their magazine a lot less.
That is most definitely true. I'm reminded of the "inverted pyramid" style that we were told ages ago is the right way to write a newspaper story. Only in these strange days, a combination of rabid top-line tribalism *but also* a fact-checking ability available to the generic news consumer that dwarfs that of any other age, and that severely inhibits outrageous falsehood, means a new style has been appearing for some time now (which you already described), in which the glutinous starchy base of the article, further down, can almost contradict the sweet sugary apex at the top, which is what the tribalist subscriber base can be assumed to bite off and chew.
It's definitely a little weird. You end up getting to the end of the story and thinking "whoa! did the same guy write these last 6 paragraphs as wrote the first? No way!"
I wonder what it's like to *be* that person, though. Have they made their peace with it, to pay the mortgage? Is it enough that some minority of well-informed people read all the way to the bottom, and they know that?
I think a lot of them are thinking about the clicks; that is, they write the last 6 paragraphs first, and then decide what angle to take to get clicks on the article.
I suspect they are operating in a world where "everybody knows" that the reality is in the last half of the article and the headline, lede and first paragraph are really just advertising for the article and - because they are advertising - can say anything that isn't literally untrue.
Sure, they *have* to be thinking about the clicks. The Internet has done its usual job of savage disintermediation, and you can no longer live an ideal journalist's life swaddled in the bowels of some enormous corporation that earns big bucks from the classified section and therefore can indefinitely indulge your wish to spend a working life reporting on true things that most people find dull or mildly offensive. The field is contracting, and everyone's got to be his own brand now, an entrepreneur, selling the product first and delivering news second.
I came across a series of Youtube videos recently that were an interesting illustration of the problem. They're by a young (by my standards so early 30s) American woman who *looks* clearly American -- tall, blonde, round-eyed -- but who speaks very good Chinese and Japanese from having studied and lived in the Far East for almost 10 years. She made a bunch of videos in which Chinese or Japanese are startled when the American blondie can understand and speak to them in their native language -- great fun, and they attracted a huge number of clicks. Which led her to think she could make a living making "an American in Japan/China" videos, but then she found out that when she made videos delving into the nuances of cultural adaption her audience was like meh -- hey, do more of that thing where the waiter gasps when you order chow-fun noodles in perfect Mandarin! Those are a hoot! She's smart enough to realize she needs to market herself first, because the bills have to be paid, but she also wants to not be stuck in the functional equivalent of funny cat videos shtick, and she's wondering how to square that circle.
I'm always a little bemused that there are so many people who are shocked and surprised that this state of affairs exists, though. ("Journalists today! My God, all they think about is pandering to their audience!" "Scientists! All they think about is appealing to granting agencies and the peers who will review their next grant application!" "CEOs and other corner-office cowards! All they think about is how to appeal to this or that customer demographic of which there may be millions but which I personally find regrettable!") As if any of these things is weird and unnatural, instead of how humans have operated since Ramses II assumed high office.
I tend to attribute it to the fact that so many modern people have spent big chunks of their life embedded in vast organizations, like the Times' reporter in the Times's heyday -- in school, or working for enormous corporations. Sort of a modern feudalism. Having less experience of what it's like to *be* an entrepreneur, or work for a small business, or work in sales, where Always Be Closing is Job #1 and you better never ever forget that, lest you have to borrow from your mom to pay this month's electric bill, they seem strangely unaware of this gritty reality experienced by legions of their fellow citizens. (And contrariwise, the people who are living the life of constant personal brand-building are finding the viewpoint of those embedded in giant orgs also bafflingly alien.)
Supposedly, reporters used to start articles with the five W's pertaining to the subject matter - i.e., Who, What, When, Where, Why. The NYT's current style, however, is to lead with just one W - What you are supposed to conclude. I especially like how they tell you this by cramming in faux context with "amid . . .," and faux causation with "following . . ." And then, in case you are really dense, an "experts say . . .," to hit you over the head with the message.
So the article will read something like: "An Asian person was assaulted yesterday. The assault occurred amid a rising tide of media reports of anti-Asian hate crimes following Donald Trump's use of the racist, xenophobic phrase 'China virus.' Experts say that such hateful comments trigger anti-Asian racial hostility and violence by white supremacists . . . . ." Blah, blah, blah for 44 paragraphs, then at Paragraph 45: "According to police reports the perpetrator was a homeless man named Deshawn Abdullah Jackson who has mental health issues and a history of making assaults in the area."
You're also assuming a false dichotomy here. It isn't inherently incongruent to be anti-police, think the BLM protests were good, dislike "wokeism", recognize complex unintended consequences and other biases- if you have an "either or" prism of looking at these things, that affects the way you will read bias.
I would like Lincoln more if he were friends with Marx. It would show he considered different opinions to his own and was humble enough to discuss ideas he disagreed with.
I disagree strongly with the characterization of the Swedish study. The study really did focus on immigration status as the most prominent result of the analysis.
In particular, Scott claims that, according to the linked article, immigration status was not "a particular focus of their study" and that "although it wasn't a headline in their results, you could use their study to determine that immigrants were responsible for a disproportionately high amount of rape in Sweden."
I went and looked up the original article and skimmed it. Here is the first paragraph of their results section:
"Results
Descriptive data
Between the years 2000 and 2015, a total of 3 039 offenders were convicted of rape+ against a woman (Table 1). The majority of the offenders were men (n = 3 029; 99.7%) and the mean year of birth was 1976 (SD 12.3). Close to half of the offenders were born outside of Sweden (n = 1 451; 47.7%) followed by Swedish born offenders with Swedish born parents (n = 1 239; 40.8%). A relatively small part of the cohort was constituted of offenders being born in Sweden with at least one parent being born outside Sweden (n = 349; 11.5%). Table 2 shows from which regions the first- and second-generation immigrants and their parents originate from. Among Swedish born offenders with one parent born outside of Sweden (n = 172), the foreign-born parent was mostly born in Western Countries (72.7%) followed by Eastern Europe (11.0%). Regarding Swedish born offenders with no parent born in Sweden (n = 177), a high proportion of the mothers and fathers were born in Western countries (40.7% and 33.9%) followed by the Middle East/North Africa (19.8% and 24.0%). The largest group of the study population was found among offenders born outside of Sweden (n = 1 451); a significant part was from the Middle East/North Africa (34.5%) followed by Africa (19.1%)."
I think this is the definition of making something a headline of one's results. One of the most prominent pieces of information in the results is the breakdown of cases by immigration status. It specifically says that more offenders were born outside of Sweden than born in Sweden to Swedish parents.
It looks like this mischaracterization was not present in the news article that Scott linked, which discusses this research paper. In that news article, they specify (with quotes from the authors) that the original purpose of the research was not to focus on immigration status, but that it was something they discovered by chance while doing the research. In particular, the claim that immigration status wasn't a headline of their results seems to have been introduced by Scott.
I don't know whether Scott had access to the original research paper - I couldn't find a freely available copy of it. However, this same highlighting that I quoted above is also present in the abstract of the paper, which is freely available. Here's the relevant content from the abstract:
"A total of 3 039 offenders were included in the analysis. A majority of them were immigrants (n = 1 800; 59.3%) of which a majority (n = 1 451; 47.8%) were born outside of Sweden."
I don't disagree with Scott's overall point, which is that the researchers face repercussions for their findings, repercussions that they likely would not have faced had they found the inverse conclusions. But I strongly disagree with the implication that one would have to go out of one's way to use their study to determine that immigrants were convicted of rape at a disproportionate rate. It makes it sound like the scientific establishment is raking through papers only tangentially related to this topic to find people to crush, and that's just not what happened.
In a way, finding this small but important inaccuracy in this essay drives home the overall point of this essay. It's necessary to distrust every source, to the extent that they're willing to stretch things or not double check things or generally be unreliable. And that applies to this essay as well.
Nope, they investigated a lot of things, and it turned out that one of the largest factors in rape offending was immigration status. Literally a majority of offenders were immigrants, so it gets reported first. What's the alternative, list everything in alphabetical order? The convention used here is extremely common in scientific papers. Scott is right, it wasn't a focus of the study. The study wasn't "Do immigrants commit more rape?".
Look at this abstract (I excluded the last sentence discussing the results). If you didn't know the results of the study, would you call this a study focusing on immigrants? Of course not!
Abstract
Sweden has witnessed an increase in the rates of sexual crimes including rape. Knowledge of who the offenders of these crimes are is therefore of importance for prevention. We aimed to study characteristics of individuals convicted of rape, aggravated rape, attempted rape or attempted aggravated rape (abbreviated rape+), against a woman ≥18 years of age, in Sweden. By using information from the Swedish Crime Register, offenders between 15 and 60 years old convicted of rape+ between 2000 and 2015 were included. Information on substance use disorders, previous criminality and psychiatric disorders were retrieved from Swedish population-based registers, and Latent Class Analysis (LCA) was used to identify classes of rape+ offenders. A total of 3 039 offenders were included in the analysis. A majority of them were immigrants (n = 1 800; 59.3%) of which a majority (n = 1 451; 47.8%) were born outside of Sweden. The LCA identified two classes: Class A — Low Offending Class (LOC), and Class B — High Offending Class (HOC). While offenders in the LOC had low rates of previous criminality, psychiatric disorders and substance use disorders, those included in the HOC, had high rates of previous criminality, psychiatric disorders and substance use disorders. While HOC may be composed by more “traditional” criminals probably known by the police, the LOC may represent individuals not previously known by the police.
Umm ... yes they do. I ran into a fellow at a research station, who was studying some bug in the desert to prove climate change. He had his paper existing in his head before he started his data collection. Just look at the number of students who have a degree in global warming.
Obviously you know roughly what you are going to do before you conduct the study, but you don't write the abstract. You do that when you are actually writing the manuscript after the study is complete. For the last paper I wrote, the abstract was _literally_ the last thing we wrote before submission.
Source: published scientist with lots of other published scientist friends.
The paper "existing in his head", is not the same thing as the paper, or the abstract, being *written*. Yeah, the guy probably had a vague concept of how the paper was going to flow and what the conclusions were likely to be, but I doubt he had more than three words strung together in his head.
And if he did, memory is *extremely* mutable. By the time he'd finished analyzing the data, he'd have an abstract "existing in his head" that's a good match for the data, vaguely similar to the abstract that existed in his head at the start, and he'd believe the two were nigh-identical.
Dangerously Unstable is right. Almost nobody writes the abstract before they've completed the research. And if the abstract isn't literally the last thing written, it's because the submission deadline for the abstract comes well before the submission deadline for the paper - it takes effort to write something brief and accurate, so writing the paper serves as a rehearsal for writing the abstract.
Also a published scientist with lots of other published scientist colleagues (and just reviewed one of their abstracts this morning, written first because of submission deadline).
As someone who is a journalist and a fairly close follower of Swedish debates on crime and immigration (I lived there for seven years, for three of them as a working class immigrant in what is now a ghetto but then wasn't), I think Scott is 90% right, but missing one important journalistic skill, which is that we know which experts to trust, and how much to calibrate in each case what you might call the Pravda factor.
You have to remember that no expert or insider will tell the whole unvarnished truth in public except in very rare cases. This is normal and natural. Either they will be misunderstood, usually deliberately and often by their own side, or they will be ignored.
But if you're lucky, and have something to trade, and if they have learned that they can trust you, they will talk much more honestly in private. Given that the Swedish debate about immigration and crime is so inflamed, and the public story so very different from what people assume in real life, the first thing I'd do is ring up a criminologist friend and ask if this story is bullshit. That would be off the record and it would have to be. Unless they felt there was a huge injustice going on, taking sides publicly would be as pointless as joining in a twitter spat.
What they told me would feed into what I then wrote. But now we're up into double layers of trust. The reader has to trust that I have a trustworthy source. Why should they? Readers don't on the whole interact with individual bylines enough to establish a relationship of mutual trust. So Scott's original heuristic is about right.
But it does lead to a genuinely damaging situation in which (to speak from experience) a Guardian executive will say "We can't use that quote because the Mail would love it". And, presumably, vice versa.
"one important journalistic skill, which is that we know which experts to trust, and how much to calibrate in each case what you might call the Pravda factor."
Do you though? I doubt this assertion and think that it is closer to confirmation bias than a skill.
I mean Tetlock has found the experts that do go on the news to be worse than those who do not.
If you understand Norwegian, this show talks to a bunch of journalists and makes the point that they choose the ability to easily convey their meaning over knowledge of the subject matter:
In other words, journalists optimize for a good interview, not an accurate interview. In addition, they like the ones that are known and will use people who they know say yes and are reliable over finding an actual expert in the field.
I can't have made myself clear; I'm sorry. A public interview is never just, or mostly an exchange of information. It's a performance for an audience. It is always choreographed and usually edited. But the conversations which inform us most are those which are held just with the source, with no readers or listeners overhearing. And it is in the nature of honest speech between two knowledgeable people that often as much is conveyed by what's not said as what is. That aspect is obviously impossible to convey in public.
(I do as it happens understand Norwegian reasonably well, though Danish is impenetrable to me.)
If one is interviewing scientists or academics, *of course* we optimise for the effect on the audience. That is because ultimately the audience or the readership are our paymasters. It is no use to anyone if you interview someone who cannot make the truth comprehensible to the audience. Translating the natural speech of an expert into the natural speech of an ignoramus is the core arbitrage performed by any specialist journalist. This is much easier in print than on radio and hardest of all on live television. So, yes, there is a natural bias towards fluency at the expense of thoughtful understanding. The point I was trying to make is that good journalists are aware of this, and make allowances.
To give a concrete example: I have interviewed both W.D. Hamilton and Richard Dawkins. There is no question which was the greater scientist, but if you're looking for someone quotable it's Dawkins every time.
Some lines the media already crosses are pretty far out there. No idea about FOX, but I do know that France 24 or TV5 Monde can quote a foreign figure saying X and "translate" that as saying the opposite of X. Or show footage of NATO tanks and imply those are from a non-NATO country. Source: a relative who watches France 24 and TV5 Monde and knows both languages.
Yes, unfortunately my prior on "formerly respected news outlet will simply make up basic facts" has drastically increased in the past few years. I've caught the BBC flatly lying about basic facts in the form Scott claims FOX would never do, several times now. A few examples:
1. Some friends sent me a TV segment about COVID as a way to "prove" I was wrong about something. The segment had an interview with a woman who was introduced as a "Dental specialist", the idea being that the NHS was so overwhelmed it was having to recruit people from other medical fields to serve on COVID wards. The woman didn't sound anywhere near confident enough to be an expert on anything, but fortunately had a rather unique name, so I quickly Googled her. She's actually a social worker who tries to get prostitutes and drug addicts to go to the dentist.
So - the BBC will lie about facts like what jobs people have.
2. They wrote an article about a vote in Switzerland, again related to COVID measures. It presented an extremely amateurish hand-drawn cartoon image and claimed this was "an ad by the Swiss yes campaign". But there wasn't any yes campaign in this case, so I reverse image searched the picture and found it came from an article about one man who rented a couple of billboards for a couple of hours to troll some protesters, because he was upset there wasn't a yes campaign. The image looked amateur and hand drawn because it was.
The same article had a graph of COVID cases with weird drops and spikes in it for Switzerland, but no other country. The caption of the image claimed this was due to delays and data errors by the Swiss government. If you're thinking that doesn't sound very Swiss, you're right. I checked and the data errors were introduced by the BBC, the Swiss COVID dashboard didn't have them.
So - the BBC will lie about things like the existence of entire political campaigns, and even graphs of government statistics cannot be trusted.
3. The BBC likes doing vox pops with people introduced as "nurse", "doctor", "teacher", "professor of X" etc. At some point it was noticed that these people would attack the government and nasty Tory party far more often than you'd expect given the voting habits of the general population. A website called Guido Fawkes started checking the background of these people and discovered that staggeringly often, they were Labour Party activists and this wasn't disclosed anywhere. At one point a COVID related Panorama special was broadcast in which every single "expert" turned out to have engaged in public left wing activism, and some were actually attending/speaking at Labour Party rallies.
So - the BBC is willing to present people as neutral experts when they're actually party political activists, and not tell anyone that.
There are many other examples that could be listed here but unfortunately I've now learned that actually, TV journalists WILL lie about basic facts like numbers, dates, job titles, events in foreign countries. The issue is not one of mere bias or selective presentation of facts, but that even the most basic claims about the most objective things cannot be taken at face value.
From the other side, some viewers of the BBC's Question Time spotted that a particularly distinctive member of the audience kept appearing in different instances of the programme held in different places. It turned out that the programme managers were encouraging people from more right-wing groups to attend the broadcasts in order to counter what they perceived as a liberal bias among the people who applied for places in the audience, and to create more on-screen argument.
Neither practice is admirable, but I mention this one to counter the notion that the bias is all in one political direction.
That, or the BBC has a very strong idea of who they want to be their "controlled opposition".
You want a combination of people who either say "Well I usually support the Tories but I oppose this thing the Tories are doing" or else people who are complete and obvious nutters and will say something stupid.
I think you're reading too much into this. The BBC comes under significant criticism from the left as well as from the right. The current director-general of the BBC, Tim Davie, used to be a conservative councillor and was deputy chair of his local constituency party. It's a large organisation with many groups and sub-groups, and it also funds some of its content from external organisations. It's highly unlikely that there is a controlled and deliberate disinformation campaign across the whole organisation.
This would be a lot more convincing as an argument if the media didn't frequently make up events that never happened or lie about them in hugely significant ways. You've covered many such events in the past in your blog as well, which makes me wonder if those all somehow fall into lies no smart person was expected to believe, even if those lies launched wars where millions of people died.
Recently Rolling Stone Magazine made up a story about ivermectin poisoning cases causing gun shot victims to be unable to get into hospitals. This was picked up and repeated widely in the left media despite having no basis in reality.
Not to mention the clear lie about horse dewormer, which is utter nonsense concerning a drug on the WHO essential medicines list, whose discoverer won a Nobel Prize, and which is an approved drug for humans in every single developed country in the world… are you telling me they didn't know that they were lying as they cashed those Pfizer advertising checks and listened to their board of directors, some of whom also sit on Pfizer's board? Is this outright and intentional lie equivalent to making up false footage of a mass shooting? Is this not the sort of totally made-up nonsense reporting you're talking about them not doing? Because they are doing it anytime they feel like it.
And what of the evidence free Russiagate story while they ignore what google and Facebook have done to interfere in the election?
What about something simple like how Rodney King’s name isn’t Rodney King? They couldn’t get his name right in the reporting and stick with their error.
These are higher profile cases and lend themselves towards controversy, but even with various other stories reality gets twisted to such a degree that it may as well be made up. Anyone I've met who has been part of a news story has said that the reporting was a lie and a misrepresentation of what went on.
Science reporting is a favourite point of malfeasance and making up nonsense to pretend a study says the opposite of what it actually says. Often with the scientist telling the reporter over and over again that they're wrong. There is no world in which those science journalists don't know the truth and limits of the study when they've spoken to the corresponding author, yet they'll just make up lies as they see fit and they do it on purpose.
Is this 100% exactly the same thing as making up a false citation to then say whatever they wanted to say? No, it isn't, but I fail to see a difference. If you can say whatever you want, then the base reality event is just random cannon fodder for the lie machine, even if you can squint at the gruesome chunks of flesh and occasionally make a guess at reality. If you want to say up is down, it isn't hard to find some loosely related 'up type' event in reality you can twist.
Your own experience with the NYT is proof enough of a low-stakes case where pigheaded reporters and editors simply wish for whatever reality they want and do whatever they want with information.
Does it matter if half the media lie and the others do not? Is it somehow better if both Fox and MSNBC go along with Bush and Powell about nonsense yellowcake and connections to 9/11?
Was a reporter not caught on a hot mic talking about how she had the Epstein story years ahead of time and had to suppress it because of management who didn't want to get cut out from the royal baby and wedding coverage? Or how they all lie about his 'suicide' that was clearly not a suicide? We just nod along and go yes yes… we smart folk know he was obviously some intelligence agent taken out by his handlers.
In any given year many false, made up, and non-factual stories run, and there'll be a mix of them doing these things on purpose, simply picking up propaganda and running with it, and the mightiest tool of censorship being non-coverage of stories to devalue them. Along with early reports, rushing, and just plain being wrong.
But they definitely make up things out of thin air when base reality fails to provide them an excuse they can use to say something else entirely.
"Not to mention the clear lie about horse dewormer which is utter nonsense..."
Well, it is a horse dewormer. It's not JUST a horse dewormer, but it is a horse dewormer. This is exactly what Scott means by being technically right and deceptive at the same time. CNN didn't say, for example, that ivermectin was used to kill people in gas chambers, which would be an outright lie.
Let's push the line a little more: what about the reporting that Joe Rogan and others who were *prescribed* ivermectin took horse dewormer? How far into "deceptive" can we go before we can call it a lie?
Your example lie is clear, but it's very far from the line. I think there are legitimate examples like this which cross the line into lies.
I take the “horse dewormer” stuff to be the same sort of statement as “our competitor fills their produce with artificial chemicals while our stuff is all natural”, where it’s clear that what they are saying is likely 100% technically true but completely beside the point and just designed to make the other guy look bad.
I don't think it's the same, and the difference is because "artificial chemicals" and "natural" aren't strictly defined terms, they are vaguely defined categories that the reader is supposed to interpret.
By contrast, "horse dewormer" is literally a class of product you can buy, therefore claiming someone is ingesting horse dewormer is clearly asserting that they are ingesting medication *intended* for horses, and not one for human consumption.
I think all of these are pretty straightforwardly defined terms, and the issue is people using them for the wide penumbra of "technically correct" uses rather than the paradigmatic uses the term is intended for.
Agree to disagree then, because I don't think those terms are straightforward to define in a technically correct sense.
I also disagree that the statement "Joe Rogan took horse dewormer" is technically correct. "Ivermectin" is neither denotationally nor connotationally equal to "horse dewormer", so in what exact sense can you claim that that statement is equal to the actually true statement "Joe Rogan took ivermectin"?
It's definitely a strawman a lot of the time, but it's also born out of the fact that in rural states like mine, there are folks going down to the feed and seed and grabbing literal ivermectin packaged as horse dewormer. I think there's a difference between believing ivermectin can help and searching out a doc that will prescribe a regimen of the formulation designed for people vs going out and grabbing veterinary medicine. There was a hilarious thread in a local fb group the other day, where some dude and a few of his adherents were loudly claiming something like: "Vaccines are stupid when you can go get ivermectin at the feed store and most I know are cured in 24 hrs"
To me the usefulness of "horse dewormer" is that as soon as someone says it, they are clearly outing themselves as having a particular political bias which is then helpful to put everything else they say into context.
If one is a scientist or doctor who has concerns about the use of ivermectin for Covid, then there should be no need to resort to "horse dewormer" to make a case. If "horse dewormer" is the best someone's got, then they don't actually have a case to make.
I have no position on the use of ivermectin pro or con, but I really don't like being emotionally manipulated.
I see what you are doing here. You are claiming it is not a horse dewormer to give an example of the kinds of lie that the media often does. Genius!!! \s
I think OP is also indulging in a bit of fact-twisting for effect here, intentional or not. Rodney Glen King was known as Glen by his friends, that's all. Not a particularly remarkable 'error', if it even rises to that standard.
This is an interesting blog post. It also has theoretical potential, when it comes to elaborate and fine-tune a theory of human interaction as such.
You are essentially at the intersection between signalling theory and semiotics. Which in my humble opinion is “where the action is” in the human sciences today (and probably in the life sciences more generally).
What you are describing is the way principals (defined as actors in a coarser information position, in this case: those who read and watch the media) try to screen messages and signals from agents (defined as actors in a finer information position, in this case: journalists and editors belonging to different news outlets) to detect which agents are trustworthy/who to trust.
... Journalists & editors send messages and signals in order to come across as trustworthy. Users of media (the rest of us) try to screen these signals and messages in order to determine who we can trust, and who not to; including when we can trust messages sent by those we normally do not trust, and when to be sceptical toward messages by those we normally trust.
One of the (interesting) points in your blog post is that some principals are better at screening such messages and signals than others, including that those who are less good may (rationally) adopt cruder strategies in lieu of fine-tuned screening abilities, such as "trust nothing from news source X", or "trust nothing except from your close circle of friends and relatives". And then you try to suggest some kind of demarcation criterion to use, to improve your screening skills. Again, theoretically interesting – and of applied interest as well!
…you might also have the embryo here, of a strategy of how one might potentially establish what a commenter to a previous blog post labelled "the inner party"; i.e. how to solve the very difficult problem of creating a circle of people who are able to subtly signal what to believe and what not to believe to each other, while at the same time being able to collectively maintain signalling something different to the "masses" (the great unwashed). Re: your story about good versus glorious harvests in good-old USSR-time Pravda. Hmmm…
…Some classic essays and articles come to mind here: “Trust in signs” by Bacharach and Gambetta; “Trust as a commodity” by Dasgupta; “Strategic interaction” by Goffman.
Elaborating this type of insights into fine-tuning a general theory of signalling & semiotics, I suggest the following one-liner as the overarching premise for this general theory: “We are all principals when observing others, and we are all agents in the eyes of others”. Meaning that “we are all in a coarser information position when observing others, and we are all in a finer information position when being observed by others….”
Good in general but totally fucked about the 2020 election because the WaPo and similar sources were not in a position to KNOW whether there had been well-covered-up fraud AND THEY DIDN’T WANT TO KNOW.
No need to get into the details here to try to persuade you about fraud in the 2020 election, just telling you that it is a really bad illustrative example. Furthermore this IS the kind of thing they would lie about for the same reasons they spiked the Hunter Biden stories they knew were probably true.
I do NOT want to get into this now, but know that I have been working professionally as a consultant in the field of elections since 2002 and I have seen several elections stolen in precisely the way this one appears that it may have been stolen (with some additional twists related to it needing to be done in 5 or 6 states at the same time and a whole lot of high-level maneuvering related to blocking scrutiny). Although I won’t go into detail, it IS certain that illegal destruction of evidence that would allow a definitive answer to the question has occurred (which is not to say whether enough evidence remains to eventually arrive at a definitive answer). You are (1) correct about the insecurity of the system (2) perhaps less than fully aware that the insecurity is built in on purpose because it functions, like gerrymandering, more as a bipartisan incumbent-protection scheme than for partisan advantage (3) wrong in this case about the parties having equal opportunity, this required a kind of coordination that was only feasible because of extremely good media control (along with the existence of key Republicans in 2 states and 1 news network who preferred that their party lose than that Trump win).
Note: this only works when carried out by insiders in cities where one party is dominant so everyone around in any official capacity is sympathetic:
1) change rules to have mail-in ballots for as many people as possible
2) create fake voter registrations and addresses so ballots will be mailed there and can be filled out and sent back
3) on election night refuse to scrutinize signatures, pass everything through, while hindering observers from the other party and locking them out of some rooms where you are counting
4) if late on election night it appears that you didn't steal enough votes this way, "find" more by introducing stacks of identically marked ballots with no chain of custody documentation to run through the machines
5) cover your tracks by combining groups of ballots that are supposed to be kept separate and other “oopsies” no one is prosecuted for, deleting logs and other data from machines, etc.
> create fake voter registrations and addresses so ballots will be mailed there and can be filled out and sent back
Okay, those lists are public. I get that local Joe might not have the money or capacity to do that, but professional GOP poll-watchers have no shortage of demographic information to compare against to find fake names and addresses.
Of course they did. They did not have the resources to look at the entire state databases, but the studies that did a random sampling of them found that a few percent could not be verified: the issue, of course, is that there is no way of telling which candidate those unverifiable registrations actually “voted” for.
The irony of this comment being made on a post about people who fail to figure out when they're being played for a sucker and so end up stanning conspiracy theories...
I am trying to figure out a way to ask this without sounding insulting, but - this was on purpose, a troll, right; you can't really have this little self awareness, can you?
I didn't say it has been proven that that election was stolen. I was *careful to say* that it had not been proven that it hadn't been stolen and that there was no justification for Scott's talking as if it had been proven that it hadn't been stolen. Furthermore, as a professional in the field, I was very familiar with all the evidential issues on both sides, and was only saying don't jump to conclusions because there has been a lot that has been covered up and not yet fully examined. I'm very suspicious based on professional experience, but I don't want to hijack the discussion on this unrelated issue, which can't be conclusively settled right now anyway, so I just criticized Scott's casual assumption that the matter has been settled (and also that the boundary of what mainstream media would fail to report properly excluded this).
What's your perspective on the $300 million Zuckerberg donation? Was it a good-faith effort to help counties genuinely in need during the election? Or was it a play to help 'count the vote' from a known partisan funneled almost exclusively into known swing districts with the aim of influencing the outcome?
Both! For public consumption the former, but because of the actual identities and agendas and lack of principled neutrality of the specific individuals who were tasked with spending this money, it worked strongly in one political direction.
I'm not close to this, so maybe I'm wrong, but it's strange to me that one of the richest people on the planet giving hundreds of millions of dollars to help 'count the vote' right before the election isn't one of the biggest news stories ever. Especially when that billionaire has known political biases.
Not to mention the cast of billionaires in the richest companies in the world all simultaneously deciding (without collusion?) what election-related discussions are allowed across the most popular platforms on the internet.
This reminds me of banana republics that have 'free and fair elections', except that the challenger has to do his campaigning from prison and the incumbent must be trusted to count the votes and report accurately. The forms are all there, but something seems a bit off.
And the biggest red flag is that I'm not allowed to talk about it or I get banned from everywhere. Claims of a stolen election are a time-honored tradition in American politics, going back centuries up through the 2016 election. I didn't question the general validity of this election until I was told I'm not allowed to talk about it. That's when I started to wonder whether something might be going on here.
Yeah, I'm not a "election truther" or anything of the sort, but the implication of the section seemed to be when WaPo tells you there's "no evidence" (!) of election fraud, that should be as convincing to you as when Fox News tells you there's been a shooting. The latter is a positive statement about a simple reality, while the former is a negative statement about a much more complicated system.
I don't know that it hurt the overall article's effectiveness, but I think that example was a swing-and-a-miss.
The "no evidence" in the 2020 election fraud claims is in the category of "no evidence even though everyone was looking really really hard." Republicans found things that they thought were evidence of fraud. They took that stuff to court, often in front of Republican-appointed judges or in states where the Secretary of State was a Republican. The judges, almost unanimously, said "no, this doesn't look like fraud." To me, that's sufficient evidence for a newspaper to report "no evidence of large-scale fraud was found in the election."
You can't just ask yourself "do I think this source has a motive to lie?" and stop there - with enough motivated reasoning, you can always think of a motive for lying on any subject. You have to ask "do I think this source has a motive to lie, to lie about the existence of whatever evidence they cite to support the lie, and accept the risk of being confronted by adversaries who want to catch them in a lie?"
If someone believes the moon landing was faked, you should ask why the Soviets (who can easily track NASA's launches) didn't reveal the fraud. If you believe that there was election fraud in 2020, you should ask why the party in power was unable to get any of their claims to stick in court. You should ask why all of the people claiming "obvious, incontrovertible fraud" when talking to reporters suddenly retracted down to "well, you can't prove they *didn't* sneak extra ballots in there" when the time came to actually make legal claims.
When I see them say "no evidence" I trust them less because it's overstating the case and trying to snow me.
There are a smattering of small things, which if all taken together in one place wouldn't be enough to flip even one state.
So that's "minimal evidence."
. . . And looking back at the headline, they say there wasn't "significant fraud." The phrase "no evidence" didn't appear at all. God damn it, I wasted my time arguing against something that didn't exist.
Sorry, I didn't mean that to be a quote from the article. But that said, I think this is splitting hairs. There was "minimal evidence" the way that there is "minimal evidence" that homeopathy works, or that psychic powers exist. That is to say, there was evidence that could be convincing at a glance (especially if you were motivated to believe it), but when looking more closely turned out not to be anything meaningful. There were things like "a whole bunch of D ballots all got counted at once in some states" which turned out to be an artifact of the counting process, for instance.
There were also a few people who got caught trying to vote twice, which happens every election, which I suppose counts as "election fraud" but has basically no connection to the Big Lie.
"That is to say, there was evidence that could be convincing at a glance (especially if you were motivated to believe it), but when looking more closely turned out not to be anything meaningful."
This is one of the fact-checkers' favorite Newspeak tropes. Translation: "Yes, your most basic, fundamental senses may lead you to believe something, but let me put some context on it for you. Let me provide *nuance*."
"You may see the fires and hear the explosions with your own eyes and ears, but let me tell you why that's a peaceful protest."
It's a form of gaslighting. They want you to doubt even your most basic senses and rationality and just defer to them for the Truth.
But is it your opinion that first impressions are always correct, that nothing is nuanced, and that context never matters? If not, there must be some cases where the "Newspeak trope" is correct, no?
Really, that's what you're going with? "My first impressions are always right, anyone who disagrees is just gaslighting me"?
Remember that "too good to check" story, when a doctor said hospitals were overwhelmed with people taking ivermectin and making themselves sick? Was Scott gaslighting us when he gave us several pages of investigation into how that claim came about, why a journalist might have thought it was plausible but it turned out to be false?
This is my explanation for why Zeynep Tufekci, a sociologist who studied the role of social media in real-world phenomena, turned into one of the most prescient COVID pundits from the very start: she has no microbiology background, but she has a finely-tuned sense of who is playing politics with the truth, and which ideas are being brushed aside for reasons besides validity.
Indeed, reading that section of your article (noticing “good rather than glorious”) had me wondering if it was a direct allusion to Dr. Tufekci, and her own professed skill in that department (as described in the [edit: article below])
Ah thank you for posting. *that* is exactly the article I was thinking of, not the one I actually posted. I vaguely recalled Tufekci writing about this topic well but misremembered in which article.
The children of narcissists and addicts also get very early training in bullshit detecting (I speak from experience, also as a therapist). It's kind of like learning a language naturally -- it's not the same if you have to learn it later running it through more conscious channels. (I don't know anything about Tufekci's childhood, so don't intend any commentary there). Maybe there's a genetic piece too, like supertasters.
News media outlets have a lot of discretion over what is news and what is not news. Obviously, wars, stock market crashes, blizzards, etc. are going to make the newspaper. But in a country of 330,000,000 people there is always more potential news to report upon than there is space for it, so judgments must be made.
For example, the New York Times, which traditionally strongly influences the rest of the news media, finds the rather dusty story of Emmett Till, a black youth who was murdered in 1955 by whites, to be worthy of constant coverage. The name "Emmett Till" was mentioned in the NYT in 57 different articles in the last 52 weeks, and in 407 articles since 2013.
The once-a-week invocation of Emmett Till serves the Times' purpose of encouraging readers to believe the Narrative that blacks are in grave danger of being murdered by whites. Granted, somebody with good critical thinking skills might notice that if you have to keep bringing up a 67 year old incident to serve as an example of your statistical hypothesis, you might not actually have a strong case. But most New York Times readers are more in tune with the mood music than with the data.
In contrast, the New York Times does not much at all like to report on black-on-white violence, treating it as distasteful police blotter items of only local interest. Not surprisingly, readers of the national news thus tend to get a highly lopsided and biased view of the criminal justice system, with disastrous consequences, such as the historic increases in murders and traffic deaths since the declaration of the racial reckoning two years ago.
I'm going to not ban you for this because honestly I started the talking about the way the media reports race and crime, but maybe limit yourself to doing this kind of thing once per comment section?
Posting Sailbait and then threatening to ban Steve for commenting more than once, that's cold. Is a man not entitled to an outlet besides his own 23,529 blogposts?
If after instituting this rule you mention black women's hair issues in a blogpost you'll probably kill the man.
If you want to give Steve a taste of his own medicine why don't you post a "much more than you wanted to know"-comment on, I don't know, an epidemiological CDC data-mystery? He won't know what hit him!
While this is certainly your blog, and you can ban whoever you want, his point here seems incredibly relevant to your post. I had no idea the Times mentioned Emmett Till so often, and it does create a perception in the minds of readers. Being constantly reminded of an egregious injustice cannot be anything but designed to create that impression. Similarly, someone following Trump's Twitter could expect to see a lot about how bad Trump's political enemies are. If we were following Trump, we should acknowledge and adjust for that bias. Noticing the bias seems like an integral part of the process, and a rationalist should absolutely go out of their way to help recognize these biases in the news we all consume.
Someone needs to create a twitter-bot that automatically Sailer-posts the following when one of those BLM-buzznames are used:
"When white policeman Kyle Rittenhouse shot George Floyd and Emmett Till on January 6th, 1619, it was not just the white supremacist murder of two more black men, it was a lynching of *all* black bodies, which built our democracy, but were redlined out of generational home equity."
I have no idea about the context or racial issues at play here, but I think the broader point is *incredibly* germane - experts and the media do not have to make grave and deliberate errors of commission to push past the boundaries of your trust. They can make choices about what to say, how often to say it, and what not to say and how much they should not say it.
I read neither The Federalist nor MSNBC, because they both make outrageous choices about what to cover. I don't know that I've ever seen The Federalist lie, but they may as well, given how slanted their coverage is. MSNBC I have seen straight out lie (or say things they really should know to be lies) as well as being deliberately one-sided to a ridiculous extent.
Knowing that the NYT reports on nearly 70-year-old news *regularly* in an attempt to rile up their readers is certainly similar, even if less egregious.
I think the NYT is quite biased, but it's not a random news story; it set the stage for lots of things. It's a piece of history, and they should reference it the same way they reference the collapse of the Berlin Wall. (That should be mentioned in a lot of stories about life in modern Eastern Europe.)
They're not reporting it like it's fresh news, are they?
I haven't read all of the stories mentioning Emmett Till, but the small sample I have seen are more along the lines of "this is like now, you should be upset" than "this is what things were like before" you might expect from a history lesson.
If a major newspaper mentioned the Berlin Wall more than once a week for a year straight, you wouldn't think that is weird and maybe putting too much emphasis on it? Mentioning it at the anniversary of its fall or something, sure, but every week?
I mean... I know you are concerned about what take-aways people will get from your comment section, but I think Steve's comment is pretty on point here.
I hope no one thinks I was telling Scott what to do. Obviously he is an expert at making an online community, whereas my blog hasn't even found its audience yet
I'm sorry, I realize that a lot of people do their best thinking in the abstract or using trivial toy examples because they are more comfortable thinking in that manner, but I do my best thinking about concrete topics of major public importance. I apologize if that discomforts readers.
I have several fairly novel insights into media bias, but they largely come from my decades observing the most important media outlet, the New York Times, spin the most controversial topics of our time such as race and crime. Unfortunately, my brain is better at coming up with and remembering ideas about topics that are important, disputed, and sensitive, so most of my better discoveries about how the dominant modes of modern discourse fail are tied to subjects about which many people get upset learning that the media's conventional wisdom is based on fallacies.
My mental model of “lying” is the distance between what someone is saying and what they consciously attribute truth to. The longer that distance, the more lieness they have. If they just refuse to come to a conclusion, to cheat and jam the lieness calculator, I give them an automatic 30% lieness score with an “undecided” annotation.
So: “Really savvy people go through life rarely ever hearing the government or establishment lie to them. Yes, sometimes false words come out of their mouths. But as Dan Quayle put it:
Our party has been accused of fooling the public by calling tax increases 'revenue enhancement'. Not so. No one was fooled.”
I like this, but there’s a missing piece. Substitute “fooled” with “betrayed.” Quayle says no one was betrayed because everyone understood the code. But collectively there was a betrayal of the information transfer process, by use of obfuscation. Obfuscation always takes a tiny bit of extra effort because it has to tilt in the preferred direction. So there was effort made to present something not congruent with Quayle’s personal attribution. That’s a higher lieness score. Zero-consequence obfuscation is not a thing; if it wasn’t accomplishing something they wouldn’t do it. Maybe it was finessing attention away from the topic, making it slide by unnoticed, so whoever bothered to think about it would crack the code and not be betrayed, but more people would simply not notice?
So “really savvy people” are not experiencing constant betrayal, because they both pay attention and know the code. They may be able to change with the conditions and not sustain harm. But the lieness score for Quayle is still nonzero.
Unwillingness to score someone on lieness is not necessarily gullibility, but it is unwillingness; if I’m not betrayed either way, surely I can look at the nonzero lieness score?
“Clueless” may struggle to distinguish the code from the lieness. They may cobble the two together into a “likelihood of betrayal” score.
I don't think it's fair to indict Quayle for this statement. Notice he's saying "our party." This line sounds to me like a reprimand of his party for lying, wrapped up in the aphorism "It was worse than a crime, it was a mistake." He is saying, "Not only did we lie, we lied ineptly!"
I may be wrong, but more context to the line is needed.
In fact 70% of people thought Saddam directly responsible for 9/11, not just that he had WMD. That he had WMD was not a fabrication of the media but the political establishment- that or they actually believed it. The media was reporting what the administration said. I don’t remember there being much opposition to the idea that Iraq probably had chemical weapons, outside left wing anti war journalism.
Calls for invading Iraq started pretty soon after 9/11, and the media did conflate Iraq and the event pretty soon after. Bush apparently blamed Iraq within 3 days, and the media implied or said outright that Iraq was responsible directly, or that Iraq helped Al Qaeda. 82% believed the latter. This couldn’t have just been the right wing media; its reach is not far enough.
I was paying *extremely close attention all the way through*, and I am sure that, although the Bush administration wished very much for the American public to absorb the insinuation that Saddam was involved in 9/11, they successfully accomplished this without ever directly claiming that he was.
Yes. Very few of the media said it directly either. However the “threat of WMD” was often shown after a montage of 9/11, or there were pictures of the twin towers burning in the background during a discussion on Iraq, or people would mention 9/11 when talking about WMD getting into the hands of terrorists.
At the time, nuclear or large-scale biological terrorism seemed like a genuine threat. Since 9/11 was a massive escalation compared to anything that had come before, it didn't seem crazy that there might be an even bigger escalation. Al Qaeda had proven themselves more capable than we expected, and the very nature of terrorism had changed from a focus on threats-and-demands to outright "fuck it let's just kill infidels". Al Qaeda of 2002 definitely _wanted_ to start nuking Western cities, and with enough nukes floating around it didn't seem totally implausible that it might happen.
Now it's 2022 and we see that 9/11 was the high water mark for terrorist capabilities rather than being a harbinger of a new era. It looks like the War on Terrorism actually did work, in the end.
There was, at the time, the reported meeting between Mohammed Atta and an Iraqi official in Prague. This was the strongest evidence of Iraqi collusion in 9/11.
Only recently did I find out that this alleged meeting was probably a mistake by Czech intelligence; the Iraqi official probably met with a different guy called Mohammed Atta.
That story disappeared very quickly after it was found to be untrue, but it may have lingered in people’s imagination. The debunking had less fanfare than the original story.
Yes. That was the CLOSEST they ever came to a direct accusation, and I remember it well, but even if it had checked out it wasn’t enough evidence to say Saddam was involved in 9/11, and it never got corroborated anyway.
I was surprised by this example in Scott’s article as well. I haven’t seen evidence that the claim at issue (that Iraq had been stockpiling weapons of mass destruction) was a lie by anyone, let alone by media outlets simply reporting the claims of intelligence communities. It turned out not to be right, but I haven’t seen reason to think that the intelligence communities fabricated it. And I would be stunned if Fox News had known it was false while reporting the claims.
Iraq had definitely had chemical weapons in the 1980s, because it used them on the Kurds.
After the 1991 war, Iraq had been banned from having them and UN inspectors had gone in to confirm this.
Iraq had been playing games with the inspectors so it was possible that they were hiding chemical weapons, but also the inspectors had never actually found any.
US intelligence had evidence that there were chemical weapons that were being hidden, but it wasn't all that certain and analysts within the intelligence community were split.
People within US intelligence, knowing that the President wanted evidence that there were chemical weapons, chose to present only the analyses that showed that there were weapons and not the analyses that said that there weren't.
I am unsure how much actual lying was going on and how much people were fooling themselves.
Yeah, there's a huge difference between having a thermonuclear warhead on an ICBM and having a barrel of leftover mustard gas hidden away with no effective delivery mechanism.
This is my memory of events as well. There was legitimate reason to believe that Iraq might, and likely did, actually have WMDs at the time of the invasion. That we never found them can still register as an "oops" rather than an intentional lie, though I am sure that there were officials and people in the media who were aware of the potential we were wrong and withheld that information. I would call that a lie, to share one side of the story and not the other, to intentionally produce an outcome.
My understanding is that Iraq had active chemical weapons programs, along with R&D programs targeting nuclear and biological weapons, up until 1998 when the Clinton administration ordered a bombing campaign against where they thought the Iraqi government was hiding their WMD programs. That bombing campaign was much more effective than we thought at the time, and Saddam decided to mostly abandon further WMD programs at least until he could get sanctions lifted, apart perhaps from some small scale stuff to lay the groundwork for post-sanctions resumption.
Following this, the Iraqi government attempted to send contradictory signals to different audiences. To the US and Europe, they tried to project the accurate impression that they'd abandoned WMDs, in hopes of getting sanctions lifted. But they also tried to convince Iran and domestic audiences that they still secretly had WMDs, to deter Iranian aggression and would-be rebels and to reassure the Iraqi military that the regime had the means to defend itself. The US intelligence community then picked up on the disinformation campaign and largely believed it.
I think you are being generous to the intelligence community. It was their job to know if Iraq had WMD or not, and an invasion was and is a major event - probably the defining event of that decade. Also have a look at the Office of Special Plans.
I don't think that they fell for it entirely for good reasons: motivated reasoning and institutional groupthink certainly seem to have played a massive role.
I forgot about the disinformation scheme that Saddam himself was using. He was trying to play both sides of that, and ended up misreading how serious the US could be about invading. I think in retrospect he could have come clean about not having any WMDs and saved himself from the invasion. But, if he had capitulated to US demands that easily, he would lose face in the region and possibly have rebellions or other invasions to worry about instead.
I read more recently that Saddam had chemical weapons whose mechanical parts had broken down. Then ISIS used those chemicals in homemade chemical weapons which they used against U.S. troops. The Army then covered up the resulting illnesses because Saddam had received the original weapons from Reagan (for use against Iran, IIRC).
Wherever you read it, it was largely wrong. Iraq's chemical weapons were homemade (these things aren't that hard to make), with the imported equipment mostly coming from Germany.
The Germans involved (private companies, not the government) didn't explicitly provide it for the purposes of making chemical weapons, but may have had a wink wink understanding about what they were likely to be used for. Three Germans were later convicted of export offences.
The US was angry like I'd never seen, even in the days of the Cold War, and that anger was focused on any faction that advocated international terrorism basing its enmity in Islamic tenets, wherever it was.
The Bush administration policy was consistent with addressing this anger. Seeking justice for 9/11 wouldn't be enough; the previous decades were a series of terrorist acts for which justice was sought, maybe obtained, only to be followed by more terrorism. To treat 9/11 as yet another police action was to continue the vicious cycle; US policy was compelled to address the root cause.
That root cause was widely seen as Iran. More precisely, the Supreme Leader and his supporters. There's a political cartoon out there depicting Hezbollah as a puppet operated by Syria, itself a puppet operated by the Ayatollah. All roads led to Tehran.
Trouble is, the Ayatollah was very well protected, self-sufficient, and in full control of the press that fed his support. If the goal was to attack the root, the US would need to turn other Islamic entities against him. Saudi Arabia was already an ally; this is why it wasn't attacked, despite 9/11 being the brainchild of a Saudi. (Besides, bin Laden wasn't hiding in Saudi Arabia by then.) Libya was too far away, and the US needed bases in the area.
For a complex mix of reasons, the best first candidate was Saddam Hussein. Iraq itself already had reason to oppose Iran. Meanwhile, Hussein was a vocal supporter of Palestinian terrorism, even known to compensate the families of suicide bombers. Hussein was an easy villain.
There was actually a three-point case to make for an Iraq invasion: Hussein's repeated violations of UN resolutions; his human rights abuses record; his possession of WMD. For reasons I still do not understand - perhaps simplicity; perhaps belief that this had the most compelling evidence - the US focused primarily on the third.
Don’t get me wrong. The newspapers were still to blame. Their job isn’t to back the government. There were people in the administration who were opposed to war, I posted a link to one, and people in the CIA who doubted the intelligence.
Everyone forgets there was a list of 121 reasons to invade Iraq. Not every one of them was an overt "act of war," but most of them were pretty bad.
One of the reasons I remember was that Saddam sent assassins to kill former US President Jimmy Carter. Saddam was trying to build "the big gun" (there's a TV documentary on this), basically a very large gauge cannon that was aimed at Israel. Saddam tried to buy uranium from Chad; we had a major internal diplomatic row over the investigation, where the ambassador's wife was outed as being a CIA agent—oh imagine that, an ambassador's wife is a government agent. The ambassador and his wife lied in the report about the results of the investigation—Saddam really was trying to buy uranium from Chad—contrary to her findings. Saddam's sons used the primary school system to collect young sex victims ... and daddy had a plastic shredder repurposed as a people shredder to take care of any complaints. Saddam used chemical weapons against his own people. The list, as I said, was 121 line items.
I’m not forgetting that. But the 9/11 confusion was intentional because the other reasons were either bogus, or insufficient legally for actually invading a foreign country!!
“The President’s son rapes women and doesn’t get prosecuted for it” is a terrible thing but international law doesn’t recognize that that justifies a foreign country bombing and invading and occupying and installing their own regime!
Well, there's no such thing as "International Law", and the big problem in the leadup to the Iraq War was trying to persuade the sort of people who believed that there was.
I was a big Iraq War proponent at the time (I was quite young). The argument that convinced me to support it was basically "Anyone who wants to overthrow a dictatorship and install a democracy is alright with me; this is a hostage rescue situation rather than an invasion"; the actual WMD issue wasn't that important for me personally.
I don’t believe in “International Law” as something that exists in the absence of specific charters and treaties, but charters and treaties are things that exist, and the UN was pretty clear that they did not consider their charter to justify the actions of the US in Iraq. More generally, the US has a very regrettable tendency to justify bombing and invading other countries based on rhetoric about “bad guys” and “evil”, while using the words “democracy” and “dictatorship” very selectively in a region of the world which includes such countries as Saudi Arabia and Iran.
The fact that bogus justifications were fabricated tells you practically everything you need to know about the legitimacy of the war. I used to believe that the intelligence agencies simply made a mistake about Saddam’s intentions and capabilities, but with hindsight it has gotten clearer over time that this was “decide on war first, assemble reasons later”.
No uranium was found in Iraq except the decommissioned weapons from the early 1990s. Valerie Plame was outed by a Washington Post journalist because her husband had, on his trip to Niger, found no evidence of those sales and had written an op-ed in the NYT to that effect.
The uranium story wasn't fake, and you have made omissions. Saddam hadn't bought uranium from Chad. But Saddam did send agents to Africa to try to buy uranium. This is what the kerfuffle was about. Valerie Plame wasn't in Iraq looking for uranium; Valerie was in Africa looking for evidence of Saddam's uranium buyers.
It's like saying Joe hired someone to murder you ... well, the murder never happened, so why are you upset ... just what business do you have, saying Joe should be tried for murder, when it's obvious the murder never happened?
Mobile truck mounted uranium processing plants were found, destroyed, and people were poisoned by salvaging contaminated containers from the wreckage for household use.
Saddam didn’t buy any uranium, and that was the case against him. Not buying uranium is not having WMD. The US didn’t present as its case for war in the UN that Iraq tried to buy uranium but failed - and even that story is dubious.
And your analogy falls down. It’s as if there was a law saying you could kill a guy if he bought a gun to kill you - which would be absurd in itself - but instead you kill him because he tried but didn’t buy a gun.
I took Michael's point not as that Saddam was guilty for having a gun he failed to get, but rather that he was planning to hurt someone in the first place. Last I checked, conspiracy to commit murder *is* considered illegal.
It's like when you are asked to comment on a colleague's skills and you say "hmm yeah, he's OK I guess", you are conveying that actually the colleague is an incompetent oaf and you'd be happy to be rid of them.
Specifically in Soviet circles you could be punished for saying things against the state or against communism, so people learned to say things very positively, but less positively than they might. It was a code, as Scott says, because you weren't allowed to speak honestly using the correct words.
It’s like when Donald Trump says they are “fine people” on both sides as opposed to saying “these are glorious freedom supporters” for the people on the right wing side of a conflict. He needs to say something positive but he can’t bring himself to say anything more than “fine people”.
In my experience it’s clueless people who end up being the gullible ones when, in the throes of their paranoia (fear), they fall for a conspiracy theory or magical religious thinking. They want to believe it. Those types are perfect marks for grifters/scammers.
For the Lincoln example, you can argue that the journalists *know* most people don't read past the headline. So the speculative piece was an excuse to push, in the headline, the myth that Abraham Lincoln was into Marx. But I accept the wider point.
For the science case: if you take something like the causes of autism, the public have a great interest in it but are led to believe it's just some random great mystery. The actual science is now in the position of making some definitive statements about likelihood. But none of this is propagated to the public lest it make unsavoury geneticists look correct.
They know that most people don't read past the headline, but they think that most people read headlines the way they do - that you read the headline to decide whether or not to read the article, and you know that the headline itself has no informational content, so if you choose not to read the article, you have learned nothing from the headline.
They know that headlines are clickbait, ie the purpose of the headline is to convince people to read the article - an article that may well then correct the view that the headline instilled.
They are *wrong* about this, but that's what journalists believe.
This is a great post. It would be even better if it explicitly acknowledged that the rules change, and that we are living in a time in which that change is rapid.
Rapid change should, and often does, undermine people's confidence in their ability to discern what is true from what is reported.
I'd highlight one particular change as having been explicitly planned and having backfired spectacularly:
Before Trump, most quality media organizations were committed to reporting on events neutrally. They always presented both sides of the argument, and avoided drawing conclusions.
The argument was then made that if one side is lying through their teeth and the other is telling the truth, this approach may serve to mislead more than inform. This sounded eminently reasonable to me in theory, and it came to pass.
Unfortunately, it has not worked out very well in practice. Being freed from presenting the other side's arguments has led to a great deal of disinformation and severely compromised my default trust level in articles appearing in the New York Times and Washington Post.
I suppose this is better than most such changes, in the sense that it was at least explicitly discussed and thought about.
Or maybe not. Maybe this illustrates how little value explicit discussion actually has, since our collective wisdom is insufficient to avoid serious harms.
I agree. Even the idea of getting "news" from "Fox" has changed. I'm guessing that most people aren't going to "Fox" for the "news", but getting it in their feed. So, as one scrolls through their feed, are they able to do the kind of code-switching required to filter "fox" "news" properly? Also - due to the rapid change you refer to, new generations may be ingesting "fox" "news" in very different ways, which would also suggest that Scott's way of thinking about the topic may be antiquated.
I thought that the quoting indicated that 'Fox' was being used as a stand in for [insert heavily biased media source of your choice]. I think most folks would agree that Fox is widely respected as a news source (though perhaps it shouldn't be).
That said, I think it's fair to say "this is a game I'm not interested in playing". That's my stance. I feel confident being able to tell the difference between the cases more often than not, but since most of the news is not interesting to me anyway, I just don't expose myself to it. I don't need to constantly worry about getting it wrong in the edge cases and waste brain cycles on that.
Given the rare scenario someone wants my opinion on something from the news, I can offer my abridged first impression thoughts based on their summary with disclaimers, or I can dig in then. This has been working well for me, but whether it does is necessarily dependent on one's social circle. (There are some where even the disclaimer "I don't actually know anything about this yet, but from what you've told me," might prompt outrage.)
But I think a lot of people who distrust the news distrust it for the stories it *doesn't* tell. For example, my instinct on reading this article's first summary on the Lincoln/Marx topic was "and how many friends did Lincoln have? Is there evidence he favoured Marx's views any more than some others?".
Similarly, when the news tells me, for example, about some bad thing [big corporation] did, but doesn't let them speak up, I wonder if the corporation has an actual reasonable justification for their actions that's being swept under the rug (sometimes they do, sometimes they don't). Same with political parties, nation states, et cetera.
And I suppose sometimes they also do just screw up and "lie", but I honestly get the impression that's just because humans are involved and humans sometimes make mistakes - it's typically not an attempt at fabricating facts. (Granted, that might be an observation true for the Tagesthemen in Germany, on which I'm basing most of my opinions, who seem to at least *want* to take journalism seriously. Fact-checking can be hard, even for big players, but it's a very, very rare event that they screw it up completely.)
See also Scott Lawrence's comment, which gets into that failure mode.
Presumably at some point you're going to be faced with decisions like "should I evacuate in the face of the hurricane that's about to strike?", or "should I lock myself at home for the next two weeks while a deadly plague sweeps through town?" or "is World War III about to start?" For that, you'll need to know whether there's a hurricane or a plague or a major international crisis in the works.
And while there are reliable specialist sources for each of those, they are specialized and it would be intractable to follow them all. So unless you're planning to ignore the world at large until you e.g. suddenly notice the roof blowing off your house, you'll need some ability to look at a general news source like CNN and differentiate between "this is important, actionable information" and "this is hype". That's the game, and if you don't play it, it plays you.
"For that, you'll need to know whether there's a hurricane or a plague or a major international crisis in the works." These sorts of problems have yet to arise without me finding out through completely different sources. I am perfectly willing to continue assuming this will be true. So I'm afraid you've done nothing to convince me that I need to play this game - but this definitely may be different for other people, and is a good caveat to keep in mind for the general case, yes. :)
Didn't Scott already write this essay? I think it might have been part of a much longer essay on another subject, and in the other version it suggested that middle class people, being one step closer to, and hence having a better mental model of, the sort of people who actually have power, are better at sorting the lies from the not-quite-lies.
To pick on the examples, though, I think you have far more faith than I do in the Washington Post's reporting on election fraud. I'm not saying that there necessarily _was_ massive fraud, but I can't see any mechanism by which the WaPo would be inclined to look into whether there was; as an organisation, the Washington Post had a fundamental incuriosity about any story that might help Trump (what _was_ the deal with those Hunter Biden emails anyway?), so they have no more interest in finding out whether there was electoral fraud than the Swedish government has in finding out whether immigrants commit more rapes.
The linked article is a perfect example of why I can't trust the WaPo's reporting on this subject, it's incredibly disingenuous. As slam-dunk proof of the paucity of fraud, it offers the fact that only a small number of double votes were found... but double voting is the dumbest and most blatant form of electoral fraud there is; I'd like to know how many mail-in ballots were stolen, either before or after delivery, and how many ballots were "harvested" in suspicious circumstances.
This article is the equivalent of "Kangaroos don't exist, I checked my back yard and my front yard and didn't find any".
"I can't see any mechanism by which the WaPo would be inclined to look into whether there was; as an organisation, the Washington Post had a fundamental incuriosity about any story that might help Trump"
If they shout from the rooftops that there was 100% definitely no election fraud and anyone who thinks so is crazy, then some Watergate tapes drop and it's proven that there actually was, they're going to look very stupid. Most news organizations care deeply about their reputation and I propose this as a mechanism which limits the rate at which they make unverified factual claims. Someone in the editor's room is getting paid to think "Wait, but what about ballot destruction? We'd better make sure that doesn't blow up on us."
News organizations shouted from the rooftops that Trump-Russian collusion was a well-established fact, and that the 'real damning evidence' was just around the corner. It was later revealed to be a fabrication by opposition political operatives and they paid no reputational price for it. Indeed, some continue to make the assertion, despite the evidence not turning in their favor.
This and a dozen other examples where collusion to peddle a 'narrative' ended in discredited stories has not hurt specific news outlets. Part of the protection in this game is that they all echo each other. There's safety in a crowd. So long as everyone toes the same official line, nobody gets punished for getting anything wrong.
And this goes back well before the days of TDS. Remember Bush Jr. and the absolute confidence that Hussein had WMDs? Nobody paid a reputational price for blindly repeating that line either.
Did newspapers say Trump-Russia collusion was established fact, or was that pundits, while the papers stuck to leading headlines and carefully hedged "According to the Grobnatz report..." statements?
Actual question, I'd like to see how much you've got on them committing to falsified claims on the issue. I don't know of any but journalism is big so I expect it happened *somewhere*.
I remember a lot of the stories she cites in the book, and was surprised at how many of those stories were outright lies - not just careful hedging. She identifies a number of statements that were later proved false, but were either not retracted or whose retraction was a minor footnote buried in section Q while the main story got front page. (More often there was no retraction.) She also demonstrates how the authors knew or should have known based on evidence available at the time that what they were reporting was not true on its face, or didn't stand up to even minor scrutiny. That falsehoods were credulously reported without any attempt at verification or falsification.
That's not to say the news media has stopped the sleight-of-hand word choices, just that there's no longer the line in the sand Scott claims they're unwilling to cross. The claim that "news is separate from opinion" is no longer true. A lot of opinion is now reported as news in the news section by news writers stating opinion as 'fact' without citing a source.
Again, I don't collect references that assiduously. I recall more than one example in the book, though, specifically regarding Russiagate. It wasn't just a phenomenon of accidental erroneous reporting. Certain people knew they were spinning fabrications into legitimate news channels and did it anyway.
This isn't even a complete list, as Taibbi himself has included others in different articles. I think it got so long he was tired of updating it continually.
> If they shout from the rooftoops that there was 100% definitely no election fraud and anyone who thinks so is crazy, then some Watergate tapes drop and it's proven that there actually was, they're going to look very stupid
This doesn't bother them much. As sclmlw pointed out, they've never faced any consequences for being badly wrong in the past, so why would they in the future?
Besides, Watergate only got revealed due to the WaPo putting investigative resources into the story. If nobody ever investigates electoral fraud in 2020 then it will never get reported on.
About election fraud, and the possibility of a newspaper finding it: I saw comments online from a man who did election-monitoring in Iraq during their first elections after the fall of Saddam Hussein's government.
That man had lots of training from portions of the U.S. Government in how to spot indicators of fraud. Some of those indicators included things that actually happened, at a local level, in the 2020 elections in the United States.
Among those indicators are: heavy uses of Absentee Ballots outside of the limits prescribed in law, irregular practices in handling Mobile Ballot Boxes, election observers ejected for a portion of the count (or told counting stopped, only to discover counting continued while the observers were gone).
This isn't slam-dunk evidence, but it is suggestive that the elections were not 100% free of fraud.
Or of course you could just learn something about the underlying measurable facts of the situation, and be able to judge when the experts (or politicians) are shading and when they're being straightforward -- on a sound *empirical* basis, and not via either the amateur social psychology you hopefully picked up in your mother's milk and/or School o' Hard Knocks, or via a Jesuitical parsing of the exact linguistics.
I mean, this is what we do elsewhere. If I want to know which financial pundits are lying through their teeth[1], the best advice is to dig in and learn something about finance, stocks, options, et cetera, master the vocabulary and math, and start paying attention to ticker symbols. Basing some critical judgment on the social psychology of journalists, or an elegant reading between the lines of their prose, is a very poor second best.
Well, let's see, I would probably start with learning some basics about guns -- what kinds there are, what kinds are legal, that kind of stuff, so when I read a report I would have some background info that let me critically evaluate reports of the use of "an assault rifle." I would also have taken some modest time from adolescence, roughly, to pay attention to the very many crime reports that come from many different sources, so that I had years to decades of background info on the approximate normal rates of murder, and how they depend on location, what kinds of motives, correlations with gangs and drugs. If it were an issue in my city, I might go to a few city council meetings where the issue would undoubtedly be discussed.
Then if I was not immediately familiar with the unfortunate town in the news, I might dig into what kind of town it was -- lots of places to get that info -- and think about whether what I already knew about the correlation between violence and nature of the burg made sense for this particular city, e.g. if it happened in Philadelphia I wouldn't be at all surprised, but if it was said to happen in Del Webb's Sun City in Retirementville UT there might be some heightened scrutiny I'd bring to bear.
Et cetera. This is just off the top of my head, mind you. If this were an important issue to me, and I really wanted to be able to form an independent basis for judgment, there's a ton of research resources I could access without moving my overweight ass from my desk chair. Click. FBI Uniform Crime Report, correlated by age, race, sex, location, clearance, et cetera. Another click. About a ton of think-tank studies on violent crime, and even specifically on mass shootings. Another click: info on weapons used, from people with a dozen viewpoints. We live in an era of information cornucopia; if you have trouble finding independent data sources that touch on mass gun violence in 2022, I suggest you're not really trying very hard.
But how are you even going to get to the "unfortunate town in the news" part, if you're assuming that the media can't be trusted to tell you the names of the towns in which there were recent mass shootings and it's all on your own personal understanding of the issue?
I don't think in such black and white terms. That's not a sound basis for empiricism; that's a scholastic kind of viewpoint, which I reject. There is considerable daylight between "everything the media says is false" and "I have no idea -- no corroborating evidence from my own experience -- that lets me evaluate how likely it is that this or that story is true, so I guess I have to wander off into hairsplitting the exact terms and speculating on the psychology of the authors, like some medieval monk trying to infer the nature of the chemical elements by studying every syllable in Plato."
You're not a kid fresh out of school, I'm sure you know how to do this, so what are you trying to say? We all weigh the credibility of testimony, all the time, in our ordinary lives. Not everything my colleagues at work say can be 100% trusted either, nor people in my community, or strangers, salesmen, contractors, nor even my own friends and family, and so in each and every case I need to weigh up knowledge I have from my own direct experience to adjust my credence. I can think of no ordinary part of my life where my only choices are blind faith or a scholastic navel-gazing analysis of only the communication itself -- where I have *no* empirical experience bearing on the subject to assist. Still less can I think of any part of my life where it's *important* to me to learn to evaluate the testimony of strangers critically, and where there's no background knowledge I could myself gain that would greatly ease that task.
FWIW, Samnytt.se (the news outlet referred to in the "immigrants' crime in Sweden" part) is basically a Swedish version of Breitbart News. Radically right-wing, racist and with a VERY lax view on journalistic integrity and – which the entire blog post is about – the truth.
Not saying the basic premise – researchers find link between immigration and crime and get heat for it as a result – is false, just that this particular news outlet is not trustworthy. (Source: am Swedish too).
Not "think". Not related to ideology. I know it from experience. But again: it doesn't take away anything from Scott's main points. Just a little unfortunate to use this particular source for the argument.
I think that makes it an even _better_ example! You're claiming that you "know" they're "untrustworthy", and I accept that you're sincere and probably not trying to deceive anyone, yet you also agree that the particular article referenced is (basically) correct.
If anything, it's an even better ('double') example for the argument being made.
No it doesn't. Every EU country has these types of "alternative news sites". They are known for very actively lying; basically their whole premise for existence is importing the U.S. culture war BS to Europe. Some are funded by the Kremlin directly, and their servers are hosted in Russia more often than normal news sites'. The official side calls all of it "hybrid warfare". Personally I don't take it so dramatically, but I still basically ignore anything those sources write; they're usually very open about it because they know their target audience. That article too: I checked the "About us" before even starting to read, so I didn't read it. (To be fair, they're hosted on a U.S. server and perhaps the site layout isn't as attention-grabby/spammy.)
Okalmaru put it very well: this was a very unfortunate choice of source from Scott.
Reminds me of the early days of the internet, when "Google" or "Wikipedia" might have been quoted as a source. What you CAN do with these sites is take their links and follow them. Read the originals and make up your own opinion. They often raise interesting issues (prosecution due to the ethics of research) but then totally misrepresent them in the text (all of Swedish science production going through some kind of woke censorship). Sometimes the issues they raise are non-issues that are adequately and understandably explained by simply reading the original source.
What Scott did was quote something that is like Wikipedia: not a source, not a journalistic product, but just someone's opinion on the internet. Like a random comment in a random comments section.
Yes, both these things are true: Samnytt is an alt-right publication, but their reporting was essentially correct anyway in this case. So it's a case of "not wrong, but a more credible source is preferable anyway" (if for no other reason than to avoid having to have this discussion first every single time).
If you want a link to a respected newspaper, you can use this:
You are also absolutely correct that the argument against the study isn't to (erroneously) say it's false, but to claim that these kinds of studies shouldn't be done as they will have harmful effects, that the person performing the study must be bad or they wouldn't have done it in the first place, to desperately search for a legal argument to discredit the scientist (there is exactly zero percent chance that the legal technicalities would have been an issue if the study had reached the politically correct conclusions), and so on, and so on.
One of the most strained arguments, from the National Council for Crime Prevention, was "maybe not drinking alcohol creates more rapes?" Yes, _really_.
But even so, looking at the TT you get a normally written news piece that isn't attention-grabby like many of those "alt news sites". Nothing about the censorship case (maybe it's another TT article?).
Samnytt's text was just colored with more outrage, precisely the toxic stuff that prevents me from looking at Twitter directly (I only open links to it from "trusted" sources, like IRC, or here). It's not just whether the facts are right; the tone is what makes it spammy or tolerable for me.
GP's columnist didn't do interviews like Samnytt seems to have done. So in that sense Samnytt's article was superior to the GP piece, since they got the researchers' voices into the writing. It's sad when traditional media can't do their job more professionally.
I don't see a point in going into detail about this particular outlet's agenda here. It's not what Scott's post, which I generally agree with, is about.
Ten years ago, during the whole muslim psychosis, I do remember this sort of whole cloth lying a lot, though.
Does anybody else remember all those stories about "no-go-areas" in Europe? All the while there were Europeans *living in those very areas* on the internet yelling that this was crazy?
I mean, "no go area" is pretty subjective. Most major American cities have areas that random middle class whites would be advised are "no go areas"; this doesn't mean that you'll necessarily get killed every time you venture in there.
Having said that, the fact that an area is "no worse than a US ghetto" is no consolation to someone who never had anything as bad as a US ghetto in their city until fifteen years ago and now finds themselves living next to one.
Not as deadly as in the USA, but there are certainly high crime areas where firefighters and ambulance drivers need occasional police protection and public transport is sometimes attacked. In Brussels, 30 local youths held two police officers and prevented them from calling backup for some time, until one of the officers could talk them down in Arabic. This might not strictly be a "no-go-zone", but certainly a "tread-very-lightly zone".
Same for Sweden. There are places you need to send two police cars every time, since if you send only one, the car will get vandalized once the police are out of sight from it. Police escorts for ambulance, attacks with rocks from overpasses and bridges on both, and so on.
Fun anecdote: My brother is an army officer who had to organize patrols in Brussels and Antwerp after the attack on the Brussels Jewish museum. He got maps with certain neighborhoods marked as "no-go" in red. Though this probably has more to do with someone higher up not wanting to stigmatize or provoke the local population than with it literally being too dangerous for the army to enter.
Most claims I have seen regarding "no go areas" indicate that these areas are not patrolled by the police, with a very strong implication that even (or especially) the police are afraid to go there. That's not terribly subjective; you can just count the police cars and uniformed officers.
Well, *someone* can. It might be annoyingly tedious and expensive to fly over to Sweden or wherever and do it yourself. But if you e.g. have access to a blog where smart nerdy types from all over the world gather to talk about whatever interests them, you could probably ask if someone has local knowledge of the matter.
Oh yeah, that was fun to watch from London. It was, like, guys you get that no one has guns here, right? Gangland warfare doesn't mean that everyone is hiding behind their engine blocks from the hail of bullets, it means a bunch of teenagers got into a knife fight outside Asda.
Yes, we also don't much care. Having to hide from criminals who might stab us to death is only slightly less frightening than having to hide from criminals who might shoot us to death.
What would actually be useful information would be the extent to which your criminals preferentially target only other criminals (because attacking e.g. tourists would bring major heat down on them all) vs. preferentially targeting outsiders (because e.g. that cements their territorial claim and touristy outsiders in particular carry extra shiny). But that's not the information that is usually being offered by reliable sources, and it can be hard to track down.
I think the difference is in falsifiability. The no-go-areas claim is sufficiently vague that no one can conclusively prove you wrong, at least not without a long argument about defining terms. But if you say "x people died" or "x person did it", those can be falsified, and in the latter case you can be sued.
I was living in one of those areas, and I was not yelling that it was crazy. To me it seemed like a serious problem that firetrucks and ambulances needed police escort when entering the area.
Excellently written. Your gift is appreciated. Reminds me of my youth when George Bush Sr. laid out his doctrine on a New World Order. It was like God had finally spoken…then my grandfather educated me about the use of New World Order in history. Damn.
A quick note on the recent, raging "Expert Failure" debate.
It appears to me that the experts have suffered a corruption of the systems they are a part of. Without getting into the weeds on what corruption means in this context, let's say that the reputational risk:reward on honest communication has become such that honesty is heavily disincentivized. Some of us have heard countless examples of "behind closed doors, my expert friends say so and so, but they wouldn't dare say it publicly" in the last couple of years.
So I suggest a solution to this: how about anonymized expert networks? This way, we get to hear from the experts, without any risks to the experts. Kind of like the semi-dark expert networks that private equity shops heavily lean on.
Similar to Metaculus, but with (an apolitical, test-based) screening for expertise and a focus on deep insights versus predictions. Would be nice if Bill Gates or some billionaire would set it up and provide compensation to the experts.
In our desperate search for truth in a post-truth world, filtering for expertise and adding anonymity may get us closer.
If it's accessible to the general public, there would be a lot of pressure to shut down such things. Just like currently social media companies are pressured to censor certain things/people.
I don't think much of the general public is actually going to consume InfoWars or Stormfront. But people will work to make both inaccessible.
There's currently a forum called "Econ Job Market Rumors" where anonymous grad students gossip about job opportunities, and also badmouth certain econ papers & economists. EJMR is considered a scandal because these anons will write offensive, politically incorrect things. It gets blamed for creating a "toxic" environment in econ:
Laymen wouldn't bother reading EJMR at all, because they don't care about the job market there. But if it weren't focused on the job market, it would be considered a threat not just to women in econ but the general public.
What makes it even harder to "know the game", is that it is not just one game, but that every scientific community develops their own rules. Climate scientists follow a pretty different set of rules than neuroscientists. If you are savvy enough to read papers from climate scientists, that does not make you savvy enough to read papers from neuroscientists.
And of course, all the same for journalism. Tabloids follow different rules than broadsheets. The science part of a newspaper follows other rules than the politics part or the sports part.
Being savvy includes knowing which articles and statements you can interpret right, and which ones you can't.
What rules? I've never seen such a thing in action. Do you have some kind of illustrative example? I'm neither a climate scientist nor deep into neuroscience, but I have no great problem reading papers by either. It's still English, not Sanskrit, still less Linear A. There's often a ton of unfamiliar acronyms and such you have to look up, but that's why God invented Google.
To be sure, I am not going to be in a position to make some finely-tuned judgment call of whether this shade of conclusion is 5% more probable than that other -- the kind of thing that exercises the people right at the frontier, leads to dueling 30-min talks at the next big conference. But this is a long way from being completely unable to grasp the degree of solidity that major broad cross-cutting Claim Foo is seen to possess. I've never found that to be a big problem, if I'm willing to put in the time required to bone up on the terms of the discussion. In what sense is this some kind of opaque process, where even an expert in one field is shut out of grasping what's going on two fields over? That really doesn't match my experience in science, which is that surprisingly distinct fields have *more* in common than one would naively think.
I have switched fields from computer science to neuroscience at some stage of my career. At the beginning, I was pretty much lost, and the big changing point was that I found someone who could tell me things like "yeah, don't trust this paper, they claim that they count synapses, but it is not really synapses that they count".
Another classical "lie" in neuroscience are statements of the form "region A projects to region B". Of course, it is not a complete lie, but the truth is usually closer to "the connection from A to B is slightly stronger than the connection between two average regions".
Of course, that is fine because experts know these caveats. For example, they know that every computer model that is based on such anatomic connections must be heavily discounted. Authors of modelling papers know that, too. But in the paper, the only sentence about this is something like "Using anatomical data, we model the projection from A to B". I think that a typical modelling paper is pretty misleading for an outsider.
Or yet another example, Scott wrote in his other recent article on the EEG study, "I'm skeptical of social science studies that use neuroimaging". I agree with that, and I would more generally take such caution with neuroimaging studies, even in neuroscience. But that is specific knowledge about neuroimaging, not general knowledge about science.
Er...sounds to me like you're saying when you switch fields you start off rather naive, and need to learn a bunch about the new field, including the working definitions of many specialized vocabulary words, before you can usefully contribute. It's difficult for me to imagine any situation or social structure in which that would *not* be true. Experience and experientially-derived knowledge are a thing. That's why we can't learn everything important from a book, or Wikipedia, and I imagine every expert in every field would say that is true about his particular field. If you decided to be a plumber or house painter or grow wheat you would also need a great deal of experience-derived knowledge before you were able to do the job competently and efficiently.
So this is a long way in my mind from saying that scientific or technical fields are deeply tribal, e.g. that There Are Rules of how you can and can't say things in this community and they're not the same as in this other community -- sort of the way things are in the ideological arena, e.g. if you're calling yourself a "Rationalist" or "part of the reality-based community" or "woke" or "red-pilled" or just "blue" or "red" (in the US), then there really *are* unmentionables and shibboleths that cannot be questioned. Very different situation, to my mind.
Scott's point was not that newspapers are tribal. If anything, it is the opposite, because the red lines of "red" and "blue" newspapers are pretty similar, like not reporting falsely about an official police statement.
And even for The Rules, I don't think it's so different. There is an informal code for how you are allowed to criticize other people's work. I would claim that it is not allowed to write "We should be skeptical of modelling studies" in a neuroscience article. (In peer-reviewed articles. It is totally ok to say that in private conversations.) It is allowed to convey this message in an article, too, but only with specific formulations. And I don't think it is trivial to understand such statements right if you are unfamiliar with the field.
I don't think it's just general politeness either. My impression from computer science is that there are fewer taboos about how you can criticize other people's work. If something's wrong, then it's wrong.
Well, I'm doubtful a priori. I would instead guess you are perhaps wrongly interpreting the resistance you are seeing: the actual root is "you don't know enough to be able to critique this or that accurately yet" and not "you aren't allowed to say it or say it this way." You're jumping to the social explanation first, because that's the easy one, the natural human go-to explanation for weirdness, whereas it's more likely in my experience that the issue is that your familiarity with the facts of the field is as yet insufficient to let you make your point with the nuance people expect.
It's like if I wrote a paper challenging the Higgs mechanism by saying "this is all bullshit because there are other possibilities for the data, e.g. these three" -- and proceeded to list three that had been long ago carefully considered, because my familiarity with the field was as yet limited -- I would get strong pushback. I could interpret that as "you're not allowed to criticize the dominant paradigm" but the real reason would be "if you're going to criticize the dominant paradigm you have to have all the background at your fingertips so you don't do it in an annoyingly boring way where people have to point out yeah that issue was raised and thoroughly discussed in 1973 so RTFM wouldja?"
Funny, my impression of computer science is the opposite. My impression of programmers as a tribe is that they are unusually brittle, psychologically speaking -- have a much harder time accepting that not everybody can, or ever will, agree on The One True operating system/way of programming function X/correct way to criticize other people's work/acceptable way of asking the girl at work out. They tend to insist on black and white even long past the point where the rest of us, given the muddled state of evidence, agree to call it a shade of gray somewhere between your preferred Pantone and mine. They tend to get into orgies of debate over The Rules because they are far less flexible about the role of rules in behaviour than other tribes.
There are a bunch of unwritten rules in journalism that are helpful to know.
For example, when I started reading newspapers during the Nixon Administration, I saw frequent references to "an unnamed senior Administration foreign affairs advisor said..." I assumed as a child that this could refer to any one of a few dozen officials. I only found out years later from reading Henry Kissinger's memoirs that it meant "Kissinger."
Agree. It's also got a comment section full of the "I totally agree with nearly everything, but here's why your examples about my side of the culture war are wrong and bad, unlike your examples about the evil people on the other side" nonsense that's making it close to unreadable lately.
That sounds like a fully general counterargument. Let's say one side really was worse in some aspect - we should be able to discuss arguments for why that might be the case. (see "Bulverism")
That might make for interesting comments on a post about asymmetries in cognitive errors in groups with differing political and/or cultural beliefs. But when the comments are full of the butthurt complaining about their ox getting gored while Scott, who is clearly a [communist/fascist/aren'ttheythesamethingreally], is not goring the other guy's ox, it's not very interesting, just tedious. [edited for clarity]
No, it's just not seeing the forest for the trees. The article was full of scissor statements. The point of the article was not to get hung up on particular controversial topics but to see the larger point - because there is a larger point. But that's something that goes very much against our nature; it requires executive function, i.e. effort and discomfort. That's why I said that feeling uncomfortable is a good proxy for it being useful.
Those interested in a rigorous look at immigration as it relates to sexual criminality in Europe should read Ayaan Hirsi Ali's latest book, Prey: Immigration, Islam, and the Erosion of Women’s Rights.
I agree with Scott that it is a really important skill to "bound your (dis)trust" when interpreting public statements (or your friend Tina when she says the food at this new restaurant is great and you should go there).
What I disagree with is the sense I get from the article that this is a binary skill (you get it or you don't). I think this is a very hard task; everybody struggles with it to some degree, and it is often impossible to figure out the right amount to trust (or what the exact bias is). Your own priors will also determine how much you should trust someone or what to take away from the statement. Really, it's just a special application of Bayesian updating, and we know how easy that is in practice.
Case in point, I think "the WP says the election was fair" should be compared to "Saddam has WMDs" rather than "mass shooting in NY". Why? Because these are the two cases where the media coverage, to a first approximation, can be explained by the fact that it repeats the official governmental position on issues where the media would have a much harder time if they wanted to endorse a different position (much like the problems the Swedish immigration crime rate study experienced). So if somebody is convinced that the election was rigged despite all official bodies saying it wasn't, the WP article isn't going to change their mind based on bounded distrust.
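The "special application of Bayesian updating" mentioned above can be made concrete with a toy sketch. All the numbers here are invented parameters for illustration, not anything from the thread: we hold a prior probability that a source is "honest" and update it each time one of its claims is later verified or debunked.

```python
def update_trust(prior_honest, claim_verified,
                 p_true_if_honest=0.95, p_true_if_dishonest=0.5):
    """Return posterior P(source is honest) after one claim is checked.

    The likelihoods are made-up: assume an honest source's claims check
    out 95% of the time, a dishonest source's only 50%.
    """
    if claim_verified:
        num = p_true_if_honest * prior_honest
        den = num + p_true_if_dishonest * (1 - prior_honest)
    else:
        num = (1 - p_true_if_honest) * prior_honest
        den = num + (1 - p_true_if_dishonest) * (1 - prior_honest)
    return num / den

# Start agnostic, then observe a mixed track record.
trust = 0.5
for verified in [True, True, False, True]:
    trust = update_trust(trust, verified)
print(round(trust, 3))
```

Note how a single debunked claim drags trust down much faster than a verified claim raises it, which matches the intuition that "bounding your distrust" is lopsided: getting caught once is far more informative than being right once.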
I think the "binary" part of the skill is whether you trust that you're 'good enough' to extract any useful info generally, not that the skill itself is binary.
I think you're correct, but I disagree that it's a good conclusion. Knowing that we can glean some good information from a messy mix of truth and lies may lead us to read and internalize false information that we fail to separate out properly.
I would take a different lesson from the examples you provide:
1. School shooting- All news portals would say that the same person killed the same number of people at the same school. There are very few variables, like the number of victims, the name of the killer, etc., and the values of these variables are not open to interpretation. You cannot say that Abdullah looked like a John.
2. Election malfeasance- The only direct variable involved is "Election fair=True/False". This variable is impossible to measure directly. Hence, either party is free to choose other variables that are indicative of the value of the direct variable. For example, Fox News might choose "trends in past elections in swing states" and say that time-honored trends were not followed in 2020, indicating that the election was not fair. Washington Post might contradict this analysis, and so on. It is only when direct variables cannot be measured, and we have to study indirect variables that we are free to choose in the manner of p-hacking, that news becomes open to interpretation.
I take your point about experts not willing to sign a petition with false claim. But this is manifestly untrue when the issues involved are political. Middle school education experts sign petitions with the claim that more funding poured into middle school education drastically improves the education outcomes of students independent of IQ. Also, Bill Clinton, the expert at having an affair with Monica Lewinsky, lied about his affair with Monica Lewinsky. Self-interest can muddy the waters significantly for experts even when talking about their own fields.
> Bill Clinton, the expert at having an affair with Monica Lewinsky, lied about his affair with Monica Lewinsky
When giving evidence, they had a long discussion about what constituted "sexual intercourse", decided that blow jobs didn't count, and he then, truthfully, said that he never had sexual intercourse with her.
It is one of the classic examples of the non-lie lie.
Specifically, they gave Clinton a definition of "sexual relations" for the purposes of the question that was probably intended to include blowjobs, but had enough wiggle room in the exact wording for Clinton to parse it as meaning that she had sexual relations with him when she gave him blowjobs, but he did not have sexual relations with her.
This is great, and I plan to share this with friends.
I would be very happy if I felt that most blue tribe people would agree with this. But my experience is that no, they won’t. They get angry and mad if I say that the New York Times isn’t really a reliable source because of biases.
If that is what you got from the article, "The NYT is unreliable because of biases," then I am sure your blue tribe friends will not agree with you.
[edit: This is basically equivalent to me saying, "Yes! I agree with the article. Red tribe people _are_ mostly incapable of logical thought. I hope my red tribe friends will finally see that I have been right all along."]
This obviously depends on the individual. My middle brother has Down syndrome. For him, a reliable source is his family. For me, NYT is fine. Fox would be fine too, but a smaller percentage of the articles on NYT fill me with rage for being manipulative and/or disingenuous. There will never be an article in any publication that will define my understanding of an issue. They are just datapoints.
With this philosophy and a willingness to roll your eyes some percent of the time, I suspect you too could find useful data on NYT. Or is the issue that you assume that most NYT readers are sheep and being swayed to a political ideology you find repugnant? (by an occasional disingenuous article)
> Fox would be fine too but a smaller percentage of the articles on NYT fill me with rage for being manipulative and/or disingenuous.
Just this phrasing here is one that i think most of my blue tribe friends (and this includes a fair number of siblings) would categorically reject. I have 8 biological siblings. None of us voted for Trump. Two identify as moderates; the one that i think really is moderate is generally seen by the rest of us as conservative.
The biggest split between us, i think, is best defined in terms of either:
a) general distrust of all media sources as being, in your words "manipulative and/or disingenuous" (this is where i am)
or - and this is harder to express because i'm still unclear on what it is
b) there are heavily biased sources (fox is an example, but so is, say, the huffington post or any political commentary show), and then there are sources which are more or less reliable
Those of us who think that corporate media is fundamentally unreliable blame it for getting trump elected. Whenever we try to bring these points up - that _all_ sources are biased and disingenuous, especially about their own biases - i repeatedly encounter a rejection of the premise. If i try to argue that, say, some fields have a general left wing bias, this gets rejected as well.
I would see the world very differently if i thought people saw scott's blog as the standard for journalism. I do consider scott's blog to be a reliable source of data, and what i mean by this is, i don't feel the need to investigate to try and figure out what scott is lying about. This doesn't mean that i'm a sheep and will just believe whatever scott writes.
Your responses are antagonistic enough that i'm going to disengage here, unless your rhetoric changes. Sure, i could possibly get useful information from an exchange with someone who is aggressively antagonistic and not interested in understanding where i'm coming from, but why would i bother doing that - or reading sources which i expect to 'fill me with rage' - when i can just _not_ do those things?
My first reply may have come across as antagonistic. I may have misunderstood your original position. Possibly based on that you misread my second reply? It did not feel antagonistic to me. I certainly was considering myself as possibly equally guilty of the behavior I was describing.
On the other hand, I don't actually see what it is that you think we disagree about which is just as good a reason to end debate...
This is partly a test. This post and the more recent one about poverty and EEGs won't load on Chrome. At first, they would load and then quickly switch to "too many requests". Now this one will load from Opera but not Chrome.
As for the topic, this isn't just about news source, it's also about cancel culture, both right and left, which are based on deciding that some source is completely disposable.
> This is partly a test. This post and the more recent one about poverty and EEGs won't load on Chrome. At first, they would load and then quickly switch to "too many requests". Now this one will load from Opera but not Chrome.
I'm having the same issue on Chrome, did not replicate on other Substacks. Easy enough to work around that I haven't dug deeper yet, but it appears to be affecting older posts on ACX as well. (But curiously, not the comments-only pages.)
Poking around a little more, disabling ACX Tweaks made the issue go away. Can't point to exactly what was causing the issue without further investigation, but I pinged Pycea about it.
"They don't talk about the "strong scientific consensus against immigrant criminality". They occasionally try to punish people who bring this up, but they won't call them "science deniers"."
The first statement is true, the second statement seems ~ false. When people bring up similar results they are often accused of "peddling pseudo-science", which seems functionally analogous to "science deniers." If someone asserts 'immigrants commit a disproportionately large amount of crime' they wouldn't be called "science deniers" only because it doesn't generally make sense to call someone a denier for asserting a positive claim.
Noun-ifying things that you hate does seem to be a common trend, though. You aren't just someone who partakes in a certain behavior or holds some belief X - you're an X-er, an anti-Xer, an X truther, an X believer, an X denier. It's a way to otherize.
I think the issue I have reading this is the impression I get that you, writing it, and everybody reading it, is going to think "Oh, I'm smart enough to have the correct level of distrust". It smells like just-world theory, only about intelligence instead of morality.
"But also: some people are better at this skill than I am. Journalists and people in the upper echelons of politics have honed it so finely that they stop noticing it’s a skill at all."
If it's a skill then there isn't a "right" level per se, though there can be 'good enough'. Scott's acknowledging he isn't flawless, though as always one should have less faith in readers than writers.
First, that's exactly what I'm talking about, when I say it smells like a just-world theory; if you get tricked by journalism, you just weren't skilled/smart enough, so really it's your own fault for not understanding what was going on. Skilled/smart people don't have this problem, it's the ignorant rubes.
But notice something: Nobody is going to think they are the ignorant rubes. This is a post which claims to illustrate something, but if you think about it, it's just telling people what they want to hear - that they have this skill that nobody talks about that lets them detect the Truth in Media.
Because, with exceptions not worth talking about for the purposes of this discussion, EVERYBODY applies a level of bounded distrust to every news agency; nobody (for the purposes of this discussion) thinks the news agencies are lying about whether or not it is currently raining in the city they're reporting from. Everybody thinks the media is lying to them about [insert opposing political/tribal belief here].
And Scott acknowledges there are certain times the media is fine with lying; there are things you cannot trust the media about at all (otherwise it wouldn't be necessary to clarify that the media doesn't lie about these particular kinds of things - it is fine with lying about other kinds of things), but there are other areas where they're careful. There are rules! You can understand the rules!
So somebody who is a climate skeptic can say "Well, this is one of the areas that the media lies about, the rules allow it here." Somebody who thinks the election was hacked can say "Well, this is one of the areas that the media lies about, the rules allow it here." Everybody can read this and think to themselves "I'm one of the people who understands the secret code of the media, I understand how and why they lie, and I can see into the truth of it." And think everybody who disagrees with them is an ignorant rube who (not exactly through fault of their own but also really if they were smarter or bothered to try to get better at this it wouldn't happen) is misled by the lies of the media and/or by their own paranoia about what the media is lying about and how and why.
(Not to even mention the fact that even if there were a set of intelligible rules in this sense that we could even agree on, as soon as they became public knowledge, they'd change, for roughly the same reason that you can't have common knowledge about how to beat the stock market.)
> But notice something: Nobody is going to think they are the ignorant rubes. This is a post which claims to illustrate something, but if you think about it, it's just telling people what they want to hear - that they have this skill that nobody talks about that lets them detect the Truth in Media.
Hard disagree - think about it as a skill like violin playing, and the answer is obvious. The vast majority of people are very poor violinists... and don't spend their time playing the violin. I know plenty of people who voluntarily decline to engage with any meaningful level of news journalism, who don't find it meaningfully blameworthy to not possess the media literacy to navigate the existing ecosystem. You just won't find them online, in the comments section, on a post on truth in media. The selection effects are obvious.
More to the point, I don't see anywhere where Scott makes even the implication that the reader of this piece is particularly skilled. Did I miss something, or is that a novel inference?
> (Not to even mention the fact that even if there were a set of intelligible rules in this sense that we could even agree on, as soon as they became public knowledge, they'd change, for roughly the same reason that you can't have common knowledge about how to beat the stock market.)
No. Anti-inductive behavior is recursive, most typically when predictions are fed back into the behavior of that which they predict. In contrast, editorial strategy is a balance between appealing to the readers' moment-to-moment interests in the short-term versus maintaining integrity for longer-term credibility. You might see a feedback loop of "improvement" when the tradeoff is not being made optimally (think movement towards a Pareto frontier), but this isn't the same as a motivated liar trying to scam a suspicious mark.
People realizing tabloids are trash will not stop tabloids from being trash, because they are not trying to be not-trash.
For the entire first section of your response, I'm satisfied with what I have already said on the subject, and see no need to continue that line of conversation.
For the second, if what you said were the case, we should expect "trust in news media" to be relatively stable; it's clearly not the case that editorial strategy is succeeding at maintaining integrity for longer-term credibility.
More, we have concrete evidence that editorial strategy has included policies which look an awful lot like "a motivated liar trying to scam a suspicious mark", in the form of previously-private message board conversations in which editors coordinated to lay out strategies of what to cover, and what not to cover, and how, for explicitly political purposes.
Given that editors are explicitly trying to direct public opinion, which requires successfully predicting behavior, then if people become aware of the strategies employed to predict their behavior, they will change their behavior in response. This is exactly the kind of situation which causes anti-inductive behavior.
> For the entire first section of your response, I'm satisfied with what I have already said on the subject, and see no need to continue that line of conversation.
I would actually like a response on where Scott implied that the readership uniformly falls into the "savvy" category. IMO, the SSC comments sphere actually selected fairly strongly against media literacy compared to LW. That's probably an unfair comparison given different audiences, but I don't feel terrible about having high standards.
I'm comfortable breaking with Scott on this topic if necessary, but the article is pretty notably not written to either group in particular. Contrast how Conflict Vs. Mistake explicitly staked out one particular side both by Scott and the blog as a whole.
> For the second, if what you said were the case, we should expect to see "trust in news media" to be relatively stable; it's clearly not the case that editorial strategy is succeeding at maintaining integrity for longer-term credibility.
First: "news media" isn't an organization and doesn't have unified incentives. See also: approval ratings of Congress v. "my Congressman". This is a critical distinction if you find yourself trying to predict the actions of actors that don't actually exist.
Second: are you really comfortable assuming away structural changes in content delivery or audience preferences? The media landscape has seen nothing but a series of exogenous shocks going back at least four decades at this point, and I wouldn't have the faintest idea of how to begin controlling for that.
Third, and with a hope of injecting some empiricism: how unstable do you think "trust in news media" *is*? Scott's commented on the Economist / YouGov polling on the topic before, and pulling in more recent 2020 data it looks like the change in weighted net trustworthiness since 2016 is NYT -1.5 less trustworthy, WaPo -0.5, WSJ +2, CNN -4, Fox +1.5, MSNBC -0.5. Given the healthy ~3% margin of error, the clear takeaway is that... nothing has notably changed? Is that what your model was predicting?
"While I've heard rare stories of the media jumping in too early to identify a suspect, "the police have apprehended" seems like a pretty objective statement."
The irony is that this literally happened last week with Malik Faisal Akram in Texas.
I understand the point being made, but I think part of our current problem (with vaccines) has to do with how much trust the pharmaceutical industry has burned in the past 25-30 years. There are some few people who won't get the shot as a political marker, but there are others that look back to the opioid epidemic and the hand-in-glove relationship with our regulators and the hair raises on the back of their neck.
We live in a world where there is no universally recognized truth Pope who can bless stories as being basically correct. Without that no one is smart enough or has enough time to review everything in enough detail to be sure it’s right. I know you just said that, but… god it’s depressing sometimes.
I would be less concerned if I didn’t also have the sense that they’d also deluded themselves. I can never get over the Twitter variant of: “I talked with my three year old today about complex geopolitical issues and they were immediately able to comprehend all the nuances and agree with my political opinions!” Which on its face you know never happened and yet my guess is that any person tweeting that would be able to pass a lie detector test on pure power of will. I don’t think it’s as bad as we fear but I also think it’s a situation where not that many people have to be bad actors before the entire lake is polluted.
For all of the ink that’s been spilled about Rogan’s medical misinformation, it’s a point he returns to fairly often. A lot has been made of the Robert Malone episode, but the John Abramson episode immediately preceding it was much better content in this regard. (Abramson, for what it’s worth, is a proponent of the vaccines)
My throughline on mistrust runs backwards from Covid vaccines are safe/masks don't work etc back to opioids are awesome, picks up some SSRI boosterism/side effect denialism, continues back to cigarettes don't cause cancer, pesticides are safe, and radiation is good for you. Also, that groundwater there is not contaminated and you'll be fine.
If it were not for the threat of how sick Covid could make me, I would have passed on the mRNA vaccine (of which I've had three) because I do think corporate-sponsored science is absolutely famous for saying "it's totally safe" not just when they don't know enough to say that but when they have actual evidence to the contrary that they're actively suppressing.
But that's just me and I'm not particularly proud of it. I don't love conspiracy theories. I do think corporations and the politicians that work for them have shown a lot of willingness over generations to lie and downplay harms that only become evident years later when the evidence is no longer deniable, but loads of money has been made meantime. And also that they're willing to factor in a lot of acceptable losses, so that their idea of safety isn't my idea of safety always.
[thank you for letting me insert this rant here; I feel better now]
Oh and that big plane there we've redesigned in a hurry under a narrow profit margin while regulating ourselves with no oversight, it's totally safe.
But meantime, if you smoke pot, you're going to become a drug addict and that magic mushroom/LSD stuff is definitely going to kill you, even though the alcohol your family is drowning in is perfectly fine.
I don’t think mistrust of pharmaceutical corporations can be described as a conspiracy theory. It was the uncontroversial norm as of 2019. In fact, Pfizer’s public perception of trustworthiness flipped from being in the bottom 10% to the top 10% within a year. There are plenty of documented reasons to cast suspicion on these guys.
I agree with you. I just meant dispositionally I'm not drawn to conspiracy theorizing AND I still have a huge amount of mistrust of a specific slice of things sometimes associated with conspiracy theorizing. Of course in my case, I feel like it's evidence-based, my mistrust.
Of course it's a conspiracy theory. It's a theory that people worked together to commit harmful acts, and didn't tell others about it.
This recent conflation of "conspiracy theory" as "a theory that people conspired with each other, one of the most fundamental human behaviors" with "a crazy theory that only wackos believe and that is definitely false" is very disturbing. It implicitly suggests that no one ever conspires, and that theories about people conspiring must be false, by definition.
It's almost like the people committing conspiracies would want this to happen...
My theory is that aside from that people are being told to specifically distrust covid vaccines, the mistrust is built on personal bad experiences with the medical system, and that personal bad experiences are not part of the discussion because mentioning that might imply that the medical system should work on treating patients better.
On the other hand, I don't know whether there's less mistrust in countries with better functioning medical system.
There seems to be an agitating anti-Covid-vaxx population in most western countries, including those who I personally think have far better medical systems than we do. (I’m being an American by assuming you’re an American)
I can buy into your theory though, especially given the number of non vaccinated black Americans who have some historical reasons to give the medical system some side eye.
I tell you three times, this is not about Tuskegee. This is about the doctors and nurses who ignored symptoms and ignored pain, and did so with great assurance, in the patient's own life, and in their social circle.
This article is probably the main reason I read Scott. I identified that he has this skill much better than I do, and would never intentionally state something he knows to be certainly false. If there is a chance of something being true or false, he uses qualifying words or even percentage guesses. And he rates his guesses at least annually, to calibrate himself. Currently, I cannot name any other source that is both better at this skill than Scott and this level of honest. If anyone else has suggestions, that would be interesting though.
+1 I would recommend nearly all of his blogrolls (at ACX and SSC) + some other substack-writers: Erik Hoel (got recommended by Scott, not a lot of posts yet, but good; one of my favs: https://erikhoel.substack.com/p/publish-and-perish ) and Tomas Pueyo of "hammer and dance" fame https://unchartedterritories.tomaspueyo.com/ (he puts the more interesting half of his texts "for subscribers only", but that is ok). But "I cannot name any other source that is both better at this skill than Scott, and this level of honest." And a better writer, I'd add.
You made a similar point in another post: “it’s not that bad if experts get things about Covid right two weeks later than MTG players”. Yes, they are something like 2 weeks or 2 years (as with N95s) late on Covid. But on other topics they are decades late and not catching up. For example, beliefs about education and signalling. You compared schools to child prisons; this position is comparable in its anti-expertise to denial of anthropogenic influence on global warming.
Two things: (1) Media bias isn't that hard to correct for, for people who have an interest in doing so. But the supply of biased stories is created by a demand for bias. Most people have no desire to correct for bias. The reading skills and thinking skills you discuss here (as with the Lincoln-Marx example and the government harvest prediction) are not nearly as hard as you make out for people motivated to suss out bias. (2) I'm usually more concerned about ignorance in news stories, which may or may not be filtered through bias in the experts chosen to ornament a story. My standard method is to pick stories in some given outlet which are about something in which you know more than the journalists. They almost always get it wrong. That usually isn't bias (although they may be biased as well.) They just aren't trained in whatever the subject is. You can use the difference between what you already know and what they report as Bayesian prior evidence for future stories in which you have no specific domain knowledge.
Isn't that hard to correct for for whom exactly? I think a big part of the 'culture war' divide (and all kinds of similarly polarized beliefs) is that, for most people, epistemology is almost entirely social. I find it depressingly rare for anyone to even _attempt_ to understand anything at a 'gears' level – even otherwise smart people!
On the other hand, I think you're right that, given sufficient motivation, people are (perhaps surprisingly) 'good enough' at doing this in practice. I think there's a very strong 'selection bias' in thinking about these issues mostly for the most controversial subjects and not noticing that there's quite a lot of knowledge/info that people mostly-competently handle.
Yes, Gell-Mann Amnesia is real, and you can somewhat correct for it, especially after you've experienced it yourself first-hand.
I agree. I don't think there's any way to find truth for people who aren't motivated to seek truth over confirmation. The market forces leading to confirmation are just too strong.
Great piece. The screening ability may be somewhat (ha ha) rarer than Scott suggests. Wonder if changes in education over the past 20 yrs have affected the prevalence of the skill. Did no child left behind reduce or enhance adults’ truth detecting intuition?
The AIER article on Lincoln & Marx was written by Philip Magness, who regularly mocks anyone who takes Chinese data on COVID deaths/cases seriously. In contrast, Greg Cochran (of "creepy oracular powers" fame) realized how serious COVID was based on the Chinese government's reaction, and regularly mocks COVID skeptics (whom he's also won multiple bets against) for thinking that large numbers of dead bodies are the kind of thing that could be easily disguised.
OK--I don't love Phil Magness either; I think he's annoying. And I agree that China trades with the world enough that we can be pretty confident about the ballpark of their COVID death numbers. But this seems to be more or less shooting fish in a barrel. Is there any direct evidence that Lincoln had even heard of Marx? Marx was not very well known until (at the earliest!) Capital vol. 1 came out in 1867... 2 years after Lincoln was assassinated.
Maybe Lincoln saw a letter wishing him well, maybe he read some articles about the Sepoy Rebellion or whatever irrelevant matter by Marx in the New York Herald-Tribune. But what actual reason (besides Kevin Kruse-esque twitter induced brain damage) is there to suppose that Lincoln had any significant connection to Marx? The idea is, to be frank, just absurd on its face. Maybe only an annoying guy like Phil Magness is willing to take the hit for saying that in the contemporary information environment, but that doesn't mean he's wrong.
Oh, I think Magness is correct about Lincoln/Marx. That's within his expertise as an historian. Contemporary Chinese death stats aren't within his expertise, and parsing Chicom pronouncements for truth vs falsehood is outside as well. Greg Cochran, on the other hand, not only has domain expertise in disease, he can also engage in "bounded distrust" of others. For example, he says one reason he knew there were no Iraqi WMDs is that he regularly read the NYT (even while their own Judith Miller was hyping the threat at the time) and remembers what he read.
Deleted my previous posts after I thought about this a bit more. The belief that "media literacy" is a skill seems to rest on a flimsy assumption: That biased journalists/experts are writing in a secret code where 99% of people will read it and believe a lie, but the truly smart people will decode the article correctly and find the truth. There's no reason to believe this always holds, even if it holds sometimes. A journalist with a single stroke of a pen could change the article so that there is no way to get at the truth. An organization could watch people successfully "decoding" the articles on Twitter, and adjust their writing style so that using the same decoder "key" will uncover another lie. It does not seem worth it to try to engage people who are writing in bad faith in such a way out of a belief that somewhere in there is a tiny speck of good faith.
>A journalist with a single stroke of a pen could change the article so that there is no way to get at the truth.
Well, yes, the pen could be an extra-wide sharpie that they sweep across the whole line of text. No way you'll ever figure out what the underlying truth was. And yes, there are lesser forms of obfuscation, but the end result is the same - pure confusion, serving not even the reporter's private interests.
The point is, biased journalists want to create the *impression* that they are delivering useful and accurate information, because without that they might as well just scribble with the sharpie. But there are still rules and norms as to what they can and can not do in the pursuit of their possibly-nefarious goals, and see Dan Rather for what happens if you cross the line. So, in order to create the impression that they are delivering useful and accurate information, they have to include some actually true and accurate information, and if you know the rules you can pull some of that out of the muddle.
Concisely stated, the rule would be to always believe extremely objective statements. As a corollary, if somebody isn't making objective statements (aka is using weasel words), just ignore them. Good heuristic if you can use it (sometimes weasel words are just too common).
While I generally agree with that, it means scientific frauds can go for awhile without being discovered.
People close to him knew he was a fraud (source: somebody who was close to him told me) but Bell Labs management didn't want to believe that and we typically accept the raw data from scientists as true; peer review is about checking methods but assumes good faith. Schon was caught because he had reused figures in ways that couldn't possibly be anything but fraud. If he had been less lazy he might not have been caught. Makes one wonder how many uncaught frauds are out there.
It would be easy to write this off as an outlier, except outliers can have extreme effects.
The analogy to news might be using photos at the top of an article which are true but wildly misleading, something which happens all the time. Stock photos to set the scene are harmless, but often people cross that line too.
One extension I would like to add is this also applies to other matters like trusting governments (ie China). I was surprised to learn over the last few years a lot of people lack the skill to appreciate the scope and types of things the Chinese government can lie about.
Something else I want to add because I'm still upset about it.
Three years ago, Jeff Bezos publicly accused the Saudi government of hacking his phone when he knew this to be false. A 100% baseless claim. I only highlight this because I thought it was quite sad to observe that Jeff Bezos thought this was an acceptable thing to do and nobody seemed to care he did it.
To the point of this article, 100% outright falsehoods are really bad and those who commit them should be shunned.
Why do you think Jeff Bezos knew his claim that the Saudis were out to get him (rather than his quasi-brother-in-law) was false?
My impression is that major player insiders like Bezos, Hillary, and Trump tend to be conspiracy theorists. Disdain for conspiracy theorism tends to be most common among upper middle class small-timers.
I think Scott’s core point is trivially true—there is some core of objective fact to most (not all) media accounts that very few journalists or experts would actually lie about—but the larger post seems like a bit of a motte-and-bailey argument. Those of us who are radical media skeptics don’t take issue with that proposition in theory, we just think that category is much smaller and far less significant than the post implies, and that what gets reported in the first instance and what does not is carefully curated, and that selective reporting and omission of important context eliminates any usefulness of the media in all but the lowest common denominator sense (i.e. I believe that if the media reports a demonstration in my part of town, I can reliably predict increased traffic and logistical difficulty in that area). Scott’s post suggests it’s possible for the sophisticated to extract more useful signal than that; I think that’s cope by those who don’t want to acknowledge how bad the situation is or are concerned about the consequences of enough people thinking like that. But that’s, like, just my opinion, man.
The discussion of particulars in the comments here seems to kind of miss the point a little. The real issue is how much you should update your views based on even true information provided by those with the ability, incentive, and stated intention to selectively present such information to you. I think the answer is “not very much” and that goes to zero if a particular event is already in your “this happens sometimes” category.
One problem in 'hot debates' is that one side of journalists can simply refuse to report certain facts or nuances, either in general or at least until the damage is done.
If you're trying to argue someone out of a misapprehension caused by lousy reporting you might be stuck resorting to sources that the person is primed to reject categorically on grounds of bias.
Note that the sophisticated tend to "extract useful signals" that happen to agree with the beliefs they already have. Other people who engage in the same kind of reading of tea leaves and come to disagreeable conclusions are being either gullible or conspiracy theorists; only the exact right level of distrust, which everybody thinks they have, nets you actually useful information.
"Scott’s post suggests it’s possible for the sophisticated to extract more useful signal than that; I think that’s cope by those who don’t want to acknowledge how bad the situation is"
I extract useful signals from the New York Times and Washington Post every day.
There's a set of folks that I'll just call "epistemic institutionalists" [I've heard the term intellectual authoritarian used but the 'A' word has negative connotations]
i.e. the idea that the responsible thing is to simply teach someone to be capable of identifying the institution that promulgates a fact/set of facts/narrative and either trust or dismiss what is said without actually sifting through the contents for value.
Now, the kind of skill that Scott is describing is basically that of an extremely high reading level which is at times combined with varying degrees of statistical literacy. There's also implicitly a certain temperament (emotional detachment) required, but let's set that aside for a moment.
The idea that, under even the most ideal circumstances, a huge portion of the population is not going to attain a particularly advanced reading level and/or mathematical aptitude (perhaps because of some underlying physiological trait that can't be significantly enhanced through environmental or medical stimulus) is *extremely* taboo with the aforementioned institutionalists.
If you're an institutionalist you more or less have custody of the youth's instruction from the ages of 7 to 18 and beyond. If you decide what you'll do with that time is to drill your students in trusting and dismissing sources out of hand you're more or less operating on the assumption that the vast majority will never attain the level of skill needed to do what Scott describes.
Perhaps experts don't lie about immigrants committing more crimes than natives, but I think they come pretty close. E.g. in https://www.svt.se/nyheter/inrikes/kriminologen-jerzy-sarnecki-las-in-unga-valdsbrottslingar-lange Sweden's most prominent (in mainstream media) criminologist says that immigration has not increased the amount of violent crime in Sweden. It is very difficult for me to believe that this expert truly believes that in the absence of large-scale migration of exotic peoples, Sweden's gang-rape statistics would have looked the same.
It's pretty common for experts to make statements that naive nice people interpret wrongly. For example, lots of nice people believe the "Girl With the Dragon Tattoo" myth that the rape in Sweden problem is overwhelmingly due to neo-Nazis.
As Scott has more or less pointed out, the race/ethnicity-crime correlation is a massive unwelcome problem even for this website to deal with. There are some facts that are too factual to be stated.
This is, more or less, how I learned about covid in November 2019. I specifically follow right wing news aggregators because I see things I don’t see in left wing/mainstream news aggregators. Enough of it checked out and the way they presented it wasn’t the way I would have expected them to present it if it was entirely fiction.
COVID was being reported on in left-wing "prestige" publications as well at that time (though it was presented as a domestic issue in China). Perhaps the main issue is more that media outlets for "regular people", like cable news, simply operate differently and assumed there was no way to talk about COVID without people freaking out (and perhaps they were right, given their experience with turning ebola into a national panic despite it never having been a threat to Americans whatsoever).
Fair enough. I think it’s a question of volume/loudness. I’m certain you could point to any large group of people and find an early warning signal. I suppose it’s a measure of institutional effectiveness how quickly that signal made it to the top/official mouth/place of prominence. For me, people who were otherwise, to use a technical term, batshit insane about things like Trump were entirely credulous COVID reporters as a group. I’m extremely distrustful of mainstream press (to the point I try to adjust my own internal thinking because I know my initial reaction is going to be too harsh), and while I wish your statement about not wanting to cause a panic were accurate, I don’t think they make those kinds of assessments. I just try to stand far back on a hill and watch everyone fight and then glean who is telling the truth based on something like troop movements. That’s what allowed me to notice it early.
“ Perhaps the main issue is more that media outlets for "regular people", like cable news, simply operate differently and assumed there was no way to talk about COVID without people freaking out”
It seems a bit more troublesome than that, since cable news has, since March 2020, been as responsible as anyone else for talking about COVID in a way deliberately designed to freak people out.
> There are lines they'll cross, and other lines they won't cross.
Good post, but I think the real issue is that they're crossing that line because it confers some advantage to them, and that's likely because they *know* some people will misinterpret it. That's clear deception no matter how you slice it, and therefore we have to ask ourselves whether we should tolerate line-crossing at all.
If I were to try and summarize it I'd say: they used statistical tools that don't start with a hypothesis, so neither immigration nor anything else was a "particular focus" (although presumably they assumed that some of the data they'd fed into the model would turn out to be relevant). However, after the statistical tools were run, their own abstract says that a "key point" is "The majority of those convicted of rape are immigrants."
> What’s the flipped version of this scenario for the other political tribe? Here’s a Washington Post article saying that Abraham Lincoln was friends with Karl Marx and admired his socialist theories.
A better comp to the Fox scenario you describe would be, just over a week ago, several liberal outlets describing the Texas synagogue hostage-taker as a "British man" and then covering a press conference where some fed said something to the effect of "doesn't appear this had anything to do with Jewish people." Ok then!
> The 2020 election got massive scrutiny from every major institution.
The 2020 election was highly irregular, and far from scrutinizing it, every major institution continues to refer to it as the freest and fairest election in our nation's history. Including evil Fox News. That doesn't mean it was rigged, but it received anything but "massive scrutiny."
> They occasionally try to punish people who bring this up, but they won't call them "science deniers".
Too good. "You may violate our women, suppress dissent, and make a mockery of any concept of democratic governance. But don't you dare call us science deniers." How to control a rationalist with this one easy trick!
> The reason why there’s no giant petition signed by every respectable criminologist and criminological organization saying Swedish immigrants don’t commit more violent crime than natives is because experts aren’t quite biased enough to sign a transparently false statement...
Why bother signing a transparently false statement when the government can just bring "misconduct" charges against them (only when results lead in a certain direction, as you point out)? As for the climate change example: a global phenomenon with global funding sources. No one government can silence dissenting voices so it is left to deep pockets to offer the carrot. Not saying the climatologist letter is wrong or a lie, but it's a poor comparable for the Swedish immigration study
This was a lot of words to sorta, kinda admit you were wrong about ivermectin. And all the reading of tea leaves and "sensing of the dynamics" does not account for an obviously coordinated effort to suppress and discredit a generic drug that at the very worst does no harm. From a bizarre media blitz of calling it "horse dewormer" to doctors losing their medical license for prescribing it and pharmacies refusing to fill it. This is unexplainable behavior, and so, they don't explain it. The harvest was meh this year, comrade. Pay no attention to those Golden Arches. Only horses eat there.
Speaking of bias, you can misrepresent a situation while doing nothing but presenting the truth. Imagine, for example, that FOX News took it on to report *every* case of a major crime committed by an illegal immigrant. They could have teams of investigators on every case, outclassing the police, reporting nothing that wasn't sourced to objective evidence or at least 3 separate, credible eye-witnesses. And it would still be a misrepresentation because it was placing undue emphasis on one group of people.
As always, thank you for your thoughtful consideration of important and complicated issues. I have a quibble:
"I think some people are able to figure out these rules and feel comfortable with them, and other people can’t and end up as conspiracy theorists."
That's a false binary; you present it as 'well-adjusted people who understand nuance vs q-anon nutters' when there are at least three categories. I'd argue that the people who blithely believe Fauci is a hero who Embodies Science and buy t-shirts and devotional candles with his likeness do MORE damage than people who think Fauci engineered covid so that Bill Gates could distribute his 5G chips. On the other side of the spectrum from Alex Jones are people who ALSO aren't able to 'figure out these rules and feel comfortable with them' ... rather than misinterpret the meaning of the game, they *fail to see the game at all*, and then fall victim to the same tribalist comfort of unthinkingly belonging to a team.
The middle, rational position isn't to 'feel comfortable' with the game - that's way too much like not seeing it at all - but to simply realize that everyone, to some degree or other, is lying to you, *all the time*, and to act accordingly.
I think this piece overstates the importance and implications of "FOX wouldn't make up the fact that there was an act of Islamic terrorism". Let's say that's true. But still, if they choose to report on 30 true acts of Islamic terrorism and choose not to report on 300 true acts of non-Islamic terrorism, what meaningful knowledge do you gain from correctly understanding that they wouldn't make up the 30 that they did report on? Same thing on the left, if CNN reports that there were 24 new laws that limit people's ability to vote, and they wouldn't and didn't make that up, but they also don't mention that there were 50 new laws that enhance people's ability to vote, what important thing have you learned from correctly understanding that they wouldn't outright lie about the 24?
Yep, I think that most of what is broadcast as national-level "news" is worse than useless for most people, and its undue prominence contributes to many problems. But such is the unfortunate reality of the human brain that a dozen deaths (if dramatically presented, even better) is a tragedy and a million is a statistic, and in a fair competition for attention, tragedy beats statistic every time.
There's another possibility: you think you understand the rules of the game but you don't really. I think this could well put you in the worst position of all. To be specific, I find it very hard to square the early statements on the possibility of a lab leak with the rule that scientists won't "flatly assert a clear specific fact which isn’t true".
As I take it that's not so much a hard rule as just an event which is very unlikely. Anyone will lie if they think there's a big enough benefit in it, even scientists. In the particular case of lab leaks and related matters I guess by coincidence both American and Chinese authorities found the possibility embarrassing and applied pressure to suppress the idea. This interacted in a strengthening manner with the ideology of Science™ among the usual suspects in the media, who were all too eager to denounce speculation as "mere craziness".
I agree, I think all the rules have fuzzy edges and the exact locations of the edges can change over time. But that makes it even *more* important not to trick yourself into trusting rules that don't actually apply.
Well, I would like to clarify a bit that what Scott meant was really a heuristic rather than a hard rule. Most of the time a scientist still won't make clear specific statements that are false so it's a good heuristic, since heuristic methods are not expected to work 100% of the time. It still feels easier to make use of relying on experts as a heuristic, while keeping track of exceptions such as "the scientist who said P lives in an authoritarian country & said country would find ~P embarrassing", since the alternative would be too cumbersome (I certainly don't have time, funding, or capability to inspect everything to my personal satisfaction, do you?).
Some suggested 2nd-order heuristics ("heuristics about heuristics") for when a scientist might say false stuff (there are probably more, feel free to suggest):
- the scientist might be pressured by an authoritarian country who finds the truth embarrassing, as discussed before
- there is some temporary crisis which could be worsened by public hysteria and authorities might credibly make use of the scientist's social status and esteem to minimize disorder & panic
- the specific false stuff is from a "cute" study widely reported in general media, and the scientist is currently holding a TED talk or being interviewed
- you may be a startup founder and the scientist is a stakeholder in one of your competitors
- the scientist is named "Euler" and you're an atheist hanging out at the Russian imperial court
The AIER rebuttal has its share of seemingly calculated half-truths and insinuations. For example, "But Marx’s articles for [the New York Tribune] consisted of brief news summaries about the Crimean War, continental European politics, and piles of dry filler material about annual crop yields and industry reports. Only a small minority of these works ventured into something resembling a cohesive Marxian economic theory"—that last part is unsurprisingly true, since there wasn't yet such a thing—but his columns seem to me characteristically passionate and ideological. The author even generously links to those articles. Did he not expect anyone to check?
Generally deceit as opposed to lying, because they technically don't lie but still act with the mens rea of giving people a false impression.
Most comments seem to focus on the deception part, but I don't feel it is the main driver here. It mainly seems to be a complex game where high-status people try to trip up low-status people, so they can laugh at them as conspiracy theorists or as people who don't understand the elite terminology.
The B) section is quite blatant once you see through it: the Swedish government and Marxists are actually trying to deceive for simple political purposes, and that is just normal politics.
IMO Section II should have referenced Russian Collusion. This was incredibly corrosive to the nation's political discourse, was relentlessly pushed by WaPo for years, and was false.
I had a similar thought the other day, reading a tweet from Jesse Singal that he had "absolutely no fucking clue who to believe about anything Omicron-related" due to the public health officials having "beclowned" themselves.
A binary "to believe or not to believe" is a naive question. Instead, the question is "what information can I extract from this?" Public health officials tend to be paternalistic consequentialists (e.g. saying what they think we need to hear rather than what is the most truthful), will get more roasted for being wrong in one direction versus the other, and that they, like other humans, have an inflated sense of their own importance, virtue and correctness.
Through that lens, I understand (more or less) why public health officials have exaggerated certain risks (e.g. outdoor transmission), overstated the benefits of certain interventions (e.g. masks), flatly denied that there is evidence of efficacy for interventions with conflicting, generally positive but low-clinical-significance effect estimates (e.g. ivermectin), and have been painfully slow to update their guidance in the face of new evidence.
If a public health official says "yes some evidence suggests ivermectin has some efficacy, but if you're so damn interested in an efficacious intervention go get the god damned vaccine", some people will only hear "ivermectin is effective." Based on that statement, perhaps 1000 people who might have gotten the vaccine won't, thinking that ivermectin will ensure their health if they get Covid. More saliently, they open themselves up to the criticism of their paternalistic public health peers. So of course they don't want to say that. Is that irritating? Yes. Is it the right move? Not sure... it's hard/impossible to predict the consequences of people hearing "ivermectin is effective" vs the consequences of "yet another misleading statement from public health" if the evidence of efficacy is denied.
That isn't to get them off the hook. I dislike the paternalism of healthcare generally, as it negatively affects me personally and I suspect is an overall "inadequate equilibria" (which is to say, there is a better way). I also get why people distrust public health officials. But what I don't get is Jesse (a smart and perceptive dude) having "no fucking clue" who or what to believe. Or rather, I don't think his problem is actually epistemic - he *does* have a fucking clue. He's really just stating his objection to misleading statements. I am sympathetic, though ideally he wouldn't be broadcasting "we can't know what to believe!" when, IMO, we have enough information to triangulate on probable truths.
This also reveals why many people should not trust public health authorities. Even if we are being generous and assume they are acting like utilitarians, they will still support policies that kill 99,000 to save 100,000.
And if the 99,000 are overwhelmingly part of a certain group (e.g. young males), then that group is right not to trust them.
Regarding "the fake news that falsely claimed that Saddam Hussein was undertaking a major weapons of mass destruction program": what exactly was it, then, that the Israeli Air Force destroyed in 1981—in Operation Osirak—literally days before it was about to go critical (i.e., before the nuclear reactor cores fired up, after which the fallout released from their destruction would be potentially devastating to the Iraqi civilian population)? Yet another totally innocent aspirin and baby food factory?
I took the Weapons of Mass Destruction claim to refer to weapons that were functional at the time the claim was made, or were being built at that time.
> I’m not blaming the second type of person. Figuring-out-the-rules-of-the-game is a hard skill, not everybody has it. If you don’t have it, then universal distrust might be a safer strategy than universal credulity.
This is where I have a failure of empathy. Figuring-out-the-rules-of-the-game is an obviously important skill, and growing up I had it as part of the school curriculum three times before I was twelve. Sure, there's a long list of critical details about how journalistic sausage is made (ex: headlines are written by other people, Opinion articles exist in a different universe from fact-checkers, "editor" as a job title is meaningless, etc.), but at a base level I don't understand how someone lasts a decade on the internet without learning to parse articles for *what is actually being claimed*. Not the selling point, not the impressions, not the feeling it tries to leave you with, but the factual information it attempts to convey. (Or just as importantly - the lack of any such.)
I'm not up to writing it all out right now, but I'll caution against portraying this as a one-dimensional binary: the opposite of the savvy reader is not merely unskilled at parsing truth, but also actively disengaged. The epistemically dangerous territory is where people get burned by not understanding the rules, and instead turn to areas where not even those rules apply. (Yes, I'm talking about social media. Comments sections very much included.)
The combination of inability and lack of give-a-shit to:
1) read sources critically
2) understand the difference between what is said and what is meant
3) differentiate between intended and likely consequences
4) understand that not everything has a clear villain and hero
These are all things that are really, really common among the average to low IQ crowd that I think the higher IQ people who can and do engage in these behaviors have a really hard time modeling.
I think I want to double down as having a lack of empathy for insufficient give-a-shit rather than a lack of understanding of missing skill. I know *why* Joe Blow just reposted an article, echoing Twitter commentary that's cleanly contradicted by its first paragraph. An assumption of charity doesn't stretch to the idea that he was haplessly duped by his previously-reliable friend who sent him the article; he just didn't *care* about its content beyond the value of signal-boosting the latest CW. That's probably in the same universe as the uncanny reader, but "I was fooled by the headline" is a shit excuse, and blaming lying journalists rings hollow when literal children know better.
The response to this is many things. But one of the responses to this is that you assume that the liars do the same quality of lying all the time. It may be that politicians are willing to lie about tax increases by calling them something else, but not willing to lie by completely denying them. But it *also* may be that their willingness to lie varies depending on political whims. Remember "read my lips, no new taxes"? That was a more direct lie than just saying that taxes are revenue enhancements. But that wasn't how Bush typically lied, and indeed when he raised taxes, he actually claimed that he hadn't lied because the new taxes were revenue enhancements. He just switched from an unbounded lie to a bounded lie, while lying about the same thing, and using one of your own examples.
>The reason why there’s no giant petition signed by every respectable criminologist and criminological organization saying Swedish immigrants don’t commit more violent crime than natives is because experts aren’t quite biased enough to sign a transparently false statement... And that suggests to me that the fact that there is a petition like that signed by climatologists on anthropogenic global warming suggests that this position is actually true.
This is only true for trivial reasons: If the media or politicians believe they can get away with telling a lie, it can't, by definition, be transparently false.
But I'm pretty sure even you can think of cases where prominent officials or media came out in favor of something that was false, and known to be false at the time. You can look at a lot of statements about COVID for examples.
This doesn't seem to touch the thing that I assume FOX News does to actually mislead people (I don't get FOX News, but I'm extrapolating from how the Sun and Daily Mail behave), which is that if over the course of the year there were 10 mass shootings, 9 by people with nice white-sounding names and 1 by Abdullah Abdul, they would not report the first 9 at all and would fill the airwaves for months with speculation about Abdullah's terrorism.
And "liberal" media TOTALLY does this lying-by-omission thing. There are lots of stories that I think are interesting and important that The Guardian doesn't cover because they reflect badly on "our" side.
[I really can't stand the tribalism of it all, hence the scare quotes, it's easier to use these reductive labels in a discussion but etc etc etc ]
Do you actually read the Sun or the Mail? I just ask because I've never seen them fail to report an important news event (and I try to look at all the front pages if I can*), regardless of the perpetrator. It's because their readership is probably a lot more interested in the event, the victims, the perpetrators, than it is in having a racist agenda pushed on it. So any sort of mass murder against the general public is front-page news, because it sells papers/brings in visitors. Their comment sections might rightly or wrongly focus on the tenth shooter, but even there that might be legitimate: to use a not-exactly-hypothetical example, if that shooter were a failed asylum seeker who had not been deported, should that not be a subject to be discussed despite the other 90% of recent mass shooters being locally born and bred? If you can cut mass shootings by 10% through immigration policy, should this not be discussed?
I suspect Fox is the same here: I've not heard that it fails to report major things (it seems to have reported that President Biden won the election, for example), but its editorial line may choose to emphasise some things over others for analysis and discussion.
As a side note, the Guardian has seemingly shifted its editorial stance of late. You're now getting not only stories about the pushback to the cancelling of Kate Clanchy, but even opinion pieces in support of her, which suggests they think their readership is not militantly woke.
* If you'd said the Express ignored nine shootings by people with nice white names (my middle class upbringing suggests this excludes Waynes, Bradleys, Traceys etc) to talk about Diana or predict unusually bad weather instead, I'd have believed you on this basis mind you!
The difference is not in which event Fox News will be talking about the day it happens - all 10 shootings will get covered - but in which one they'll still be talking about a week later (flip as needed for other outlets).
I mean, we saw this sort of thing very clearly with the synagogue hostage situation - some outlets were careful to only report on the perpetrator as a “British man” and were very credulous about claims that the crime was not targeted at Jews. But EVERYONE covered it.
Everyone covered it, and everyone forgot about it very quickly. If the perp had been a white supremacist then undoubtedly it would still be the top story on both CNN and Fox News, but he wasn't so we'll never hear about it again.
Agreed that what each outlet chooses to focus on after an event varies. That may not be a bad thing, though, since all outlets will report the shootings, any arrests, the start of trials, and convictions. So although the interpretation may skew towards particular views, I think Scott is right that there's a basic level of trust around the facts that can be shared by all news providers in your average democracy, while there are spaces in which different viewpoints can be examined. We might favour opinion pieces that blame the shootings on class repression, racism, excessive liberalisation of the law enforcement agencies, or lack of proper religion nowadays, depending on taste, and we have outlets that let us explore those preferences, but at least we're confident that it's only at this second level that the basis for discussion changes. We are at least all discussing the same basic facts. In fact, that's probably a reasonably good rule of thumb for conspiracy theories: if one claims all news outlets are ignoring something majorly newsworthy, then it's probably (>95%) a conspiracy theory.
To be fair, the NYT and the WaPo are not exactly Fox News, and they too were parroting "unnamed highly placed sources" who insisted that Iraq was chock-a-block with WMDs, and later parroted other conspiracy theories in support of Empire.
I have similar contempt for them that I have for Julius Streicher and Alfred Rosenberg. And Fox, for the record.
I enjoyed the article, but I'm not quite sure of what the take-aways are, other than: "Don't believe everything you hear. Don't disbelieve everything you hear. Keep working on improving your map." Which is just par.
I tend to lean very hard on the "trusting the establishment" heuristic. Needless to say, it makes me a bit of a pariah in many circles. If we had a regime where more people actually read articles rather than headlines and at least skimmed the Wikipedia article on a subject before commenting on it, discussions on politics would improve exponentially.
On the other hand, I struggle with how much of this is because I might have skills or heuristics that other people just fundamentally don't have. And furthermore, where those heuristics come from. Did I learn them in school and could it be improved though education, or is it just an individual quirk of me and people like me?
I do remember that. At most though, it was like a one-day unit in freshman high school English. And this was an upper-middle-class high school in a major metropolitan area, for reference. Could also be a function of high reading/writing comprehension.
"I tend to lean very hard on the "trusting the establishment" heuristic."
"at least skimmed the Wikipedia article on a subject before commenting on it"
This is very revealing.
Wikipedia's entire schtick is to act like a neutral aggregator of information, and then systematically censor all non-establishment sources and opinions.
> I'm a liberal who doesn't trust FOX News, and sure, I believe it. The level on which FOX News is bad isn't the level where they invent mass shootings that never happened. They wouldn't use deepfakes or staged actors to fake something and then call it "live footage"
How do you know?
I am not asserting that this is true. I am not asserting that Fox news, or any other news outlet, does this.
What I am asserting is: _**IF**_ they did this, you would not be able to know.
So I am curious: What is the basis of your epistemology such that you can say with confidence that Fox News wouldn't make up video coverage of a shooting?
(And, lest you forget, I will remind you that similar falsifications have been documented repeatedly across all news agencies. https://www.nytimes.com/2019/10/14/business/media/turkey-syria-kentucky-gun-range.html is the most recent example I remember. Story: SPOOKY TURKISH TERRORISTS SHOOTING UP EVERYTHING. Video: a gun show in Kentucky. Why are you so certain that Fox would _never_ do this, when ABC did exactly the thing you are describing, three years ago?)
That last parenthetical seems like a very different scenario. Using the wrong footage (whether through incompetence, malice or just laziness) to illustrate an actual story is one thing.
But the reason I trust a big news network not to fake a whole mass shooting is simply that it doesn't serve anyone's incentives to do so. Faking a whole live news story in the middle of New York is a huge undertaking with a budget of millions and hundreds of staff who would need to work on it. Even if it goes perfectly and nobody ever blabs and none of the other news networks figures out your subterfuge and nobody ever notices that all the victims were paid actors, the maximum benefit is rather small and the risk is enormous. What's the point? It's a lot easier to cover the real news than it is to stage totally fake events.
Of course in practice the left has never accused the right of faking a mass shooting, it's the other way around (Sandy Hook). It seems equally implausible in this case for the same reasons.
> Even if I learned of one case of them doing something like this once, I would think "wow that's crazy" and still not update to believing they did it all the time.
Well, this is interesting and probably highlights a fundamental difference in epistemology between you and me.
I assume everything is always at equilibrium, unless actively disrupted. That means, if I catch the news doing this _one time_, and I can't point at a unique and specific explanation for it, I will assume it's been happening all along and I only just noticed.
This seems so obviously correct to me that I'm curious as to how you can convince yourself that "oh it just happened once"
Your view would be correct under the assumption that your ability to notice things is low, and the variance in the news's untruthfulness is low. If you adjust those assumptions, it becomes reasonable to believe a single case is actually just an outlier.
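This back-and-forth can be made concrete with a toy Bayesian model. All the numbers below (fabrication rates, detection probability, priors) are illustrative assumptions, not estimates about any real outlet; the point is only that when your chance of catching any given fake is low, a single catch can move you a long way toward "this happens routinely."

```python
# Toy Bayesian model: does one caught fabrication imply many?
# Hypotheses: H_rare  = outlet fabricates ~1 story in 1000
#             H_often = outlet fabricates ~1 story in 20
# A reader watches n_stories and spots any given fabrication with
# probability `detect`. All numbers are illustrative assumptions.

def posterior_often(prior_often, rate_rare, rate_often, detect, n_stories):
    # P(catch at least one fabrication) under each hypothesis.
    p_catch_rare = 1 - (1 - rate_rare * detect) ** n_stories
    p_catch_often = 1 - (1 - rate_often * detect) ** n_stories
    # Bayes' rule on the observation "I caught one".
    num = prior_often * p_catch_often
    return num / (num + (1 - prior_often) * p_catch_rare)

# Start 90% confident fabrication is rare; watch 100 stories with a
# 30% chance of spotting any given fake; then catch one.
p = posterior_often(prior_often=0.1, rate_rare=0.001,
                    rate_often=0.05, detect=0.3, n_stories=100)
# p comes out around 0.75: one catch moves 10% -> ~75%.
```

Which of the two commenters is right, on this model, turns entirely on the assumed detection probability: with `detect` near 1, a single catch really would look like an outlier.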
> Or "FOX is against gun control, so if it was a white gun owner who did this shooting they would want to change the identity so it sounded like a Saudi terrorist".
They do this all the time, +/- a technicality.
Every news outlet aggressively suppresses demographic data when reporting on crime, if and only if that demographic data is not aligned with their narrative. See, for example, Coulter's Law. Sure, it's not technically lying, because they didn't present a false identification. But selectively hiding the identification when it's editorially convenient is no different.
It is different because the not-technically-lying bit leaves you with *some* bits of accurate information. If they tell you that the shooter was white, you have high-confidence information that the shooter was white. If they don't mention the race of the shooter but do mention the race of the victim, then by Coulter's law you may have moderate confidence that the shooter was nonwhite. If the reporters were actually lying outright about that sort of fact, then you'd have *zero* bits of accurate information - CNN would always tell you that the shooter was a white Trumpist and Fox would always tell you he was a Saudi terrorist or whatever, and you'd have no way of knowing.
Unless you like leaving potentially useful bits of information lying around unclaimed, you might want to make use of the fact that reporters are generally "not technically lying" even if they do frequently omit things.
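To put a number on those "bits": here is a minimal sketch of the inference the comment describes. The reporting probabilities are made-up assumptions for illustration, not measured figures about any outlet; the point is that a selective omission is itself evidence.

```python
import math

# If an outlet reports the shooter's race with different probabilities
# depending on what that race is, the *omission* carries information.
# Illustrative assumptions: race reported 90% of the time when the
# shooter is white, 20% otherwise; 50/50 prior.

def posterior_white_given_omitted(prior_white, report_if_white, report_if_not):
    omit_if_white = 1 - report_if_white
    omit_if_not = 1 - report_if_not
    num = prior_white * omit_if_white
    return num / (num + (1 - prior_white) * omit_if_not)

p = posterior_white_given_omitted(0.5, 0.9, 0.2)   # ~0.11

# Information gained relative to the 1-bit-uncertain prior:
entropy = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
bits = 1 - entropy   # roughly half a bit from the omission alone
```

Outright lying (always reporting the narrative-friendly race) would make the report independent of the truth and drive this quantity to zero, which is the asymmetry the comment above is pointing at.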
One additional complexity is that this heuristic of distrust is not symmetrical. What you say about FOX is true, but you couldn't turn that around and apply it to the New York Times. The two sides of the political aisle have different epistemologies.
The Right twists the truth because they see themselves as working in service of the Truth, or at least America. A significant contingent on the Left have disavowed both Truth and America. In their value system, truth claims are only a method for subverting some power structure or other, and are judged not primarily by accuracy but by how well they support whatever progressive narrative is in vogue.
> There are lines you can cross, and all that will happen is a bunch of people who complain about you all the time anyway will complain about you more. And there are other lines you don't cross, or else you'll be the center of a giant scandal and maybe get shut down.
Are you... are you watching the same country I am?
I don't know about Scott, but I'm watching the country where Dan Rather and Brian Williams lost their jobs for crossing those lines. The lines may not be drawn where you'd like them to be, and you may be outraged by what goes on on the other side of them, but they do exist and they mark a (somewhat convoluted) safe space for accurate information.
Ivermectin ... probably does some good. Most of us in the northern hemisphere don't carry much of a parasite load ... yes, carrying a parasite load is a thing. For others who live in wetter climates, where eating a tomato fresh off the vine can expose you to parasites, things are different. Probably all of us carry some slight parasite load, and setting those parasites back a bit probably benefits us. Anthelmintics don't necessarily kill off all the parasites like a magick silver bullet, but just set them back a bit ... or maybe quite a bit, based on dosage. Does ivermectin have other benefits? Yes, I read years ago—when I was a cowboy, and using a lot of ivermectin—that treated populations in South America saw a reduction in certain types of cancers. Not that I ever intentionally treated myself ... but the form of ivermectin we used was the pour-on form. You have a bottle with an open chamber on top; you squeeze the bottle and the chamber fills with the desired dosage ... and ideally you pour the ivermectin in a stripe down the center of the animal's back, just as you do with flea drops on your cat. But there you are with an open cup of ivermectin, trying to pour it down the back of an eleven-hundred-pound animal that is scared, fighting, and doing a pretty good job of kicking your ass ... things get a little wild, and you wonder who received the better part of that dosage. But back to Dr Malone ... who developed mRNA technology, who is fully vaxxed, and who works on alternative uses for existing meds ... and suddenly this guy is a pariah. Something is going wrong.
Now on to climate science ... all you have to do is read the Climategate files, and notice that anyone who says "whoa, let's think this through" gets labeled a science denier. Just last week we learned that climate change causes volcanic eruptions ... Ummm, where are the adults in the room? And climate change causes floods, and droughts, and warming, and cooling, and everything is weird because of climate change ... and maybe the actual real changes due to climate change are so very slight that no one will ever feel it ... unless you're on the spectrum, and you can see CO2—a clear gas—in the air.
Here's the real problem with climate change. Global warming is causing polar ice and glaciers to melt, which raises sea level. Increasing warmth also causes sea water to expand, further raising sea level; in fact, the combination should produce an acceleration in sea level rise. Great, we have something we can measure. Now go look at actual sea level rise data from NOAA. https://tidesandcurrents.noaa.gov/sltrends/sltrends_station.shtml?id=9414290
It's about 2mm per year, and as you can see, pretty flat and steady. According to the IPCC AR5, CO2 was not at a sufficient concentration to affect the temperature until about 1950. But the San Francisco data show sea level rise back in 1854, a full 100 years before CO2 could have done the job. So there's a lot going on that we're being lied to about.
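To make the "about 2mm per year" figure concrete, here is a minimal sketch of how a station trend like that is typically estimated: ordinary least squares on a monthly mean sea-level series. The data below are synthetic (a built-in 2 mm/yr trend plus noise), not actual San Francisco readings, so this only illustrates the method, not the NOAA result.

```python
import random

# Synthetic monthly tide-gauge series: a 2 mm/yr trend plus Gaussian noise.
# This is an illustrative sketch, not real NOAA station data.
random.seed(0)
months = range(12 * 70)                        # 70 years of monthly samples
years = [1950 + m / 12 for m in months]
TRUE_TREND = 2.0                               # mm per year
level = [TRUE_TREND * (y - 1950) + random.gauss(0, 20) for y in years]

# Ordinary least squares: slope = cov(x, y) / var(x)
n = len(years)
xbar = sum(years) / n
ybar = sum(level) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(years, level)) \
        / sum((x - xbar) ** 2 for x in years)
print(f"estimated trend: {slope:.2f} mm/yr")   # recovers roughly 2.0 here
```

With seventy years of monthly data, even fairly noisy readings pin the slope down tightly, which is why a long tide-gauge record gives such a stable trend estimate.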
> In the end, I stuck with my belief that ivermectin probably didn’t work, and Alexandros stuck with his belief that it probably did. I stuck with the opinion that it’s possible to extract non-zero useful information from the pronouncements of experts by knowing the rules of the lying-to-people game.
What do you do when you don't know those rules?
After all, you can't learn them by asking people; they might be lying to you.
> The people who lack this skill entirely think it’s crazy to listen to experts about anything at all. They correctly point out time after time that they’ve lied or screwed up, then ask “so why do you believe them on ivermectin?” or “so why do you believe them on global warming?” My answer - which I don’t think is an obvious or easy answer, it’s a bold claim that could be wrong, is “I think I have a good sense of the dynamics here, how far people will bend the truth, and what it looks like when they do”. I realize this is playing with fire. But listening to experts is a powerful enough hack for finding the truth that it’s worth going pretty far to try to rescue it.
If I understand you correctly, you're essentially saying "trust the experts if and only if they say things you already independently know (or strongly suspect) to be true".
That's equivalent to saying "don't listen to experts, just listen to yourself".
So at least we both agree that we should not listen to experts, because they lie.
No, he's saying trust the experts only if you know the rules governing what the experts will and will not lie about and how. That's a different thing, a more easily learned and generally useful skill than e.g. specific understanding of climatology.
> And the clueless people need to realize that the savvy people aren’t always gullible, just more optimistic about their ability to extract signal from same.
After watching what you fascists did to my society in the name of 'health' last year, there is zero signal to extract and you're all just either hallucinating one, or brazenly appealing to authority to attempt to pull rank over me
Always looking for new people to add to my list of interpreters. Anyone brave enough to post some they've found? Joe Wisenthal in finance (odd lots podcast) is my favorite example of this.
To an extent. Here's some examples of tactics common among historians:
1) Quote someone approvingly, or build upon their work, etc., while leaving out or brushing over their lies (or directly quoting the lies approvingly, but leaving just enough room so you can say they said it and not you). See for instance Vine Deloria and Red Earth, White Lies.
2) Make an unsupported statement that at the same time can't actually be disproved. Very common in fields like art history, where you can make all sorts of claims about the author's intentions. My favourite example is a claim that there is a link between trains and atrocities like the Holocaust - not based on any claim that railways make committing atrocities easier, but rather on something psychological about passengers being essentially trapped on a train until it stops.
3) Just plain lying and hoping no one will notice. See Arming America.
A few hours late to the posting frenzy and not sure if anyone will read this, but this made me think a lot about my bounded distrust of science (I'm a biologist) and other peoples' trust in science. I think Scott's 5-HTTLPR post on the old blog summarized the dynamic in science beautifully. If something becomes a possible right answer, there will be an endless stream of small scale studies "proving" it, which is why there are a few hundred studies showing a wrong link between 5-HTTLPR genotype and depression. Likewise:
- It is zero surprise to me that there are a bunch of small scale studies showing that Ivermectin treated COVID and that these did not reproduce in the large TOGETHER study. The Alexander vs. Marinos vs. Katz argument about what small studies to include and exclude seemed almost meaningless. Infinite small studies can be wrong.
- I expect there will be hundreds of studies showing that Long Covid causes virtually all human medical conditions. There are some already and will be more in the coming years.
- There will be lots of studies showing neurological impairment or whatever from lockdown or being a child during the COVID period. These are also starting to emerge
- There will be a bunch of studies showing longterm vaccine side effects, although fewer than Long COVID positives because this is a less respectable right answer.
And of course most of these findings will be wrong.
And yet I don't "mistrust science". When Pfizer does a study on their vaccine I assume it's correct, and I got vaxxed and boosted as soon as I could. There's a subtle difference here. If Pfizer were doing 50 small-scale studies on different vaccines I suppose I'd group them with the myriad ivermectin studies. But holy crap, how does anyone unfamiliar with the subtleties of our information ecosystem believe anything biology produces anymore? You are constantly bombarded with our wrong studies and yet you haven't burned our labs to the ground yet. Perhaps there have been enough unambiguous successes (polio vaccine, your children not dying all the time) that the general population is willing to forgive us for being wrong a lot, but I feel like COVID has exposed how much wrong crap we publish in a way that never really happened at this scale before.
This article puts a label on why I was so furious about the "who cares, politicians always lie, take him literally but not seriously" business during the Trump presidency - because it imagines that all lies are equal. If the New York Times writes a slightly misleading headline then you can't trust them at all, they're as good as the people who think the moon landings are fake.
If the Soviets say that the harvest is "glorious" when they really mean "good", that's still correlated with reality - it's not as good as they claimed, but you probably aren't going to starve this winter. If they announce that *every* harvest is glorious, and each fall has greater and greater surplus no matter how many people are starving in the streets, you get *zero* information. The lie is not connected to reality at all.
We can't force every publication to provide only perfectly accurate information, for many reasons. But we can put bounds on what sorts of lies are acceptable, and make sure that they don't stray too far from the truth. And that makes it important to (1) call out blatant lies, to prevent the boundary of acceptability from shifting, and (2) draw a distinction between calling something out as misleading vs false, to avoid communicating the idea that they're all equally lacking in information.
The flip side of this is that the media was using “Trump lies!!! He’s a dangerous aberration!” as justification for more or less totally throwing out their rule book on reporting with at least a veneer of objectivism rather than open advocacy. Part of “take him seriously, not literally” was pushback against media catastrophizing everything Trump said and acting like they had never been bullshitted by a politician before, despite every one of them having the exact sort of carefully tuned bullshit meter that Scott talks about in this post.
Boring linguistic point: Abd-ul-Allah, contracted to Abd-ul-lah, means Serves-the-oneGod. Abd-ul just means Serves-the. When you see the name Abdul, it's usually barrelled with a second element, like Abdul Rahman: Serves-the-Merciful. (The Merciful is also Allah. He seems to have a lot of names.) Abdullah Abdullah and Abdullah Abdulrahman would be fine names for a Saudi terrorist. Abdullah Abdulhussein would not, since Hussein (handsome one) isn't a name of Allah, and Salafi Muslims sniff at names that imply servitude of anyone but Allah.
Historians understand problematic issues related to "objectivity", and journalists are even more subject <sic> to these because of the temporal difference. A prior, clear statement of principles and beliefs has always been useful for evaluating historical or journalistic interpretations of data or evidence. If someone tells me where they stand then it really helps me with verifying (or falsifying!) their ideas.
Wow, did you realize when you published this that Alexandros had already quote-tweeted what looks like a smoking gun on IVM? Author of a meta-review seems to admit he knows IVM works, but that he plans to ease into that result over the next 6 weeks, allowing hundreds of thousands of needless deaths in the interim. https://twitter.com/alexandrosm/status/1486136274385702912
Quite the contrary . . . Hill seems quite biased towards trying to prove that ivermectin "works," even though he knows that without the fraudulent and biased studies, the overall evidence shows that it doesn't really work. It is completely absurd to think that ivermectin would save "hundreds of thousands" of lives in a few weeks . . . if ivermectin had that strong of an effect, it would be like Gleevec, and no one would be quibbling about the evidence.
On the autistic spectrum, this is the story of my life.
Everyone routinely states falsehoods, and I don't have the brain wiring to easily and intuitively figure out their intentions, including whether there's any intention to deceive.
I stated one in my second paragraph - somewhere, there's almost certainly some human who never states falsehoods, even if it's only a pre-verbal infant.
A high functioning autistic learns, often painfully, the specific rules that govern the statements made in their cultural niche, and what variations are common or possible. They then watch these deduced rules get violated in increasing numbers, conclude there's been yet another cultural shift, and work on deducing the new set of rules. (Or they move to a new sub-culture, and find the rules there unexpected, having been told (falsely) by non-autistics that their local rules are self-evident aspects of human nature.)
FWIW, at the moment I don't have a good sense of what falsehoods are acceptable in advertisements made in the US - beyond far too many. I don't have a good sense of what falsehoods are OK in support of political positions - is there *anything* Trump wouldn't say, if it benefitted him? And while I don't expect public figures among his political opponents to make *checkable* false claims about elections, or the numbers at a public event, that's about all I'm sure they wouldn't do.
Maybe the Soviet Union used coded language, as you suggest, preserving meaning once one learned the secret decoding key. That's certainly common in both resumes and job advertisements, with a few outliers producing entirely fake experience, and (more commonly) non-existent bait-and-switch job ads. There might be a similar way to decode advertisements and political speech, but AFAICT it's easier to make no checkable claims, beyond "this product is infinitely wonderful [in unspecified ways]" and "our political opponents are infinitely evil [attached to a list of lesser sins, often unverifiable]".
With the rules changing constantly, I'm not entirely sure it matters. Major mistrust makes sense, along with a heavy dose of epistemological uncertainty.
This and the EEG study arrived in my inbox 3 hours and 2 minutes apart, and this amuses me.
I like this post, I agree with it in theory, but in practice I kept saying "but I'm not sure you are making your point here." For example, the argument that Fox wouldn't report on a mass shooting event if it didn't happen while other news wouldn't report on there being no fraud if there was. These don't seem equivalent to me. One produces tangible bodies. The other produces anomalies in paper trails that due to our preference of anonymity over security are hard to validate. And I say that as someone who doesn't think there was (above normal background levels of) fraud in the election.
And this makes the proximity to the EEG post amusing. Because here is a batch of articles with easily referenced evidence, with holes ready to be poked in it, that yes, a handful of experts I don't follow on a platform I don't use shot down - but it isn't like the usual suspects are going to substantially correct their news articles, are they? What is the difference between yet another poverty/child development story and yet another there-was-no-fraud story? At what point can you be confident you are actually threading that needle instead of just suffering from Gell-Mann Amnesia?
This reminds me of a post by Jacob years ago, something about how unless you are REALLY REALLY smart, taking the stupid route in a game is more effective than the slightly smart route, and it's hard to know which category you are in.
The best analogy is the legal system. Everyone understands that the lawyers are advocates for opposing sides and that they will be cherry-picking the facts and spinning the conclusions and inferences to be drawn from those facts. They are both "untrustworthy" as statements of the truth. But, the two arguments (and counterarguments to those arguments) are expected to include the relevant facts and analysis to get the truth. So the Judge has to winnow out the inadmissible and unreliable evidence and decide who is most persuasive.
Basically, you have to read the NYT as a plaintiff's brief in support of the woke left's case, and then go out and find the counterarguments and act like a judge.
Maybe tangential, but I’m not sure “everyone” gets this. At least not in the sense that “the lawyer’s job is to advocate for their client to the best of their ability, within the rules”. Can’t count how many times I’ve heard people be shocked and appalled that a lawyer might, say, question the credibility of a victim. That is literally his job!
People, your readers especially (it seems), are scared to believe that they aren't tracking geopolitics. You aren't.
You pay attention, but what you pay attention to is a moving target reported by interested parties. It isn't what they report so much as what they don't/won't/aren't permitted to report.
You can't spend your life pretending that believing you are ABLE to track what is going on better enables you to actually do it. You can't, and it doesn't (in my opinion).
I am not a fan of corporate media, so you could easily dismiss this as an argument against certain sources of information.
Travel the world, notice the discrepancies.
Ask Koreans how much they actually think about North Korea in everyday life.
Ask an Israeli citizen-soldier what they KNOW/witness on the front lines (take with a grain of salt of course, but ask ten!).
These facts you can hold to be self-evident ... we do not have control over what we do not know, and every "fact" we are willing to gobble up that is afforded us by interested parties can be, and often is, used as a means to further an agenda.
Personally, I like to think of myself as a savvy conservative. I watch both liberal and conservative media, and I try to understand the biases and compensate for them.
However, personally I think there is a real problem with the way we do science. For example, if it's true that immigrants cause more crime (and it generally is), then scientists shouldn't be punished for saying that. Truth should be the ultimate defense against accusations of being a bigot. If immigrants do indeed cause more crime, then THEY should be the ones who suffer the consequences of that fact - not the scientists who simply point it out. Because ultimately, it's the behavior of the immigrants that needs to be corrected, not the behavior of the scientists, and that behavioral adjustment can't happen if scientists are attacked simply for pointing it out.
Part of the reason that I spread conspiracy theories to destabilize the status quo and bring about a new world order is that I believe a society where we are not allowed to talk about objective scientific data without being accused of heinous thoughtcrimes simply because we are "going against the narrative" is an evil society that deserves to die and be replaced by a society that has more respect for the truth. Do I genuinely believe most of the conspiracy theories I spread? No, of course not. But just as our elites are willing to harm innocent scientists for pushing an inconvenient narrative, I am willing to harm our elites in retaliation for them harming those scientists. If they push us, we push back. If they put one of us in the hospital, we put one of theirs in the morgue.
Institutions that are allergic to truth are evil garbage and the people in charge of those institutions or societies need to be cut down and replaced by any means necessary. If they don't like it, then they should start showing more respect for the truth and stop attacking people just for pointing out inconvenient facts that interfere with their desired narrative. We won't ever be able to eliminate tribalism until we're willing to hurt people for demonstrating that trait. And if we want to live in a high trust society, then we need to make truth our most sacred value. Without that, trust collapses and society falls apart.
As a good friend of mine once said "Aim high, but hit low."
*Who* is not allowing "us" to not talk about objective facts? I suggest it's the people in the mirror. It's we ourselves, in various tribes, that stick our fingers in our ears and shout down the voices we don't want to hear. Facebook and Youtube don't censor shit primarily because the Illuminati or George Soros insist upon it, but because that makes their platform more popular among a substantial demographic -- because the bulk of their own users demand it. People *love* to find a Judas goat and drive it out, it's one of our most favorite forms of social entertainment and team-building exercise. (One is tempted to recommend Shirley Jackson's immortal "The Lottery," or adduce the Aztecs ripping the hearts out of children to remind the Sun not to forget to rise.)
We all love to have "like" and "dislike" buttons on every act of intercourse we come across, so we can express our intentions. *We* demand the censorship of those voices we don't want to hear, and the people who earn their living tending to our communication wishes oblige, since it puts $$ in their pocket.
If you want it to be different, you need to build a stronger social mythology where hearing people saying (what to you seem) dickhead things is something with which you're expected to put up. It's curious that you think that way, care to elaborate why, preferably with factual observation? None of this crap about "oh we ought to be able to ban lies/disrespect/misdirection/rudeness," because it is just far too easy for the wolf of ideas censorship to creep in under the sheepskin of civility enforcement. That's why the First Amendment doesn't admit of any "hate speech" exemptions, our ancestors were less foolish than we are these days in that regard.
You make an excellent point, and I fully agree with you. The main force calling for censorship are the ignorant masses - the stupid narcissistic sheep who believe that anybody who disagrees with them is evil. How do we change that?
The answer is simple. The best way to turn something into a sacred value is to punish anybody who disrespects that value. For example, people currently respect the sacred value of diversity because you can get fired for disrespecting that value. Imagine if disrespecting the sacred value of truth were enough to get you killed? I bet you that our society's sacred values would shift from diversity to truth REAL fast.
You might say that the people pushing for censorship outnumber us. My counter argument is that we're much smarter than them, and there are a million ways for us to manipulate them into destroying themselves, or playing them against a superior force, or manipulating the electoral process so that we can elect political leaders who literally wipe out anybody who believes in censorship. Wolves should not fear sheep. We need to remind the censors and the cancellation mob what fear is. Currently you fear them, and that is the inverse of how it's supposed to work.
So, no censorship unless someone believes in censorship (as defined by you), at which point you should have the right to put their head on a spike. Fair play for all, save those who disagree, in which case you should be allowed to conduct total war against them. Is this the libertarian version of that malapropic formula of the Paradox of Tolerance?
Yeah I don't agree with that at all. Far as I can tell, this is the exact line of thinking that led from the French Revolution to The Terror: "let us just promptly cut the heads off of everyone who doesn't hold rigidly to the revolutionary principles. mon frere. Liberte, egalite, le guillotine!"
Not for me. Not interested in a police state, or a theocracy. Anyone who says "give me power so I can punish those who don't think correctly" is someone who automatically goes on my list of people to be exiled, come the revolution and me on the Committee of Public Safety.
I just have a really hard time believing in the "spread lies in order to create a society that tolerates truth" algorithm. It's certainly immoral, but I think it's probably also highly ineffective.
I also don't have a good solution or strategy to offer, though. Certainly amplifying alternative viewpoints and pushing back hard on the credibility of establishment sources seems necessary, but spreading things I know to be false seems counter-productive (and wrong).
It sounds like you're trying to categorize the types of conditions where biased agents are most likely to lie. Instead of doing this bottom-up, as you do with your examples, you should try top-down. You might come up with better buckets than I do below.
My belief is that transparency and access to the same data from multiple observers is where lying is least likely to occur (your example on shooting and the suspect) because it's falsifiable. When there isn't transparency and only a small number of people have access to the direct data (e.g., early 2020 information on natural vs. lab leak origins of COVID), lying should be the starting assumption. If the data isn't easily falsifiable ("sorry, that's a proprietary data set"), and someone has an incentive to lie, there's probably lying going on. Especially true if there is data sitting somewhere that intentionally isn't being made available.
This reminded me of this Military History video. It really made me think about what it would be like to grow up exclusively within a biased environment, and how hard it would be to recognize it - not recognizing the bias, but recognizing the scope.
Also on the dystopian scifi/fiction front, Kameron Hurley's "The Light Brigade". Even when you know someone's lying to you they can still fool you into believing a different lie.
I grew up in socialism, and we were taught that the system is perfect, but of course individuals are imperfect, and there is also active sabotage by enemies. So whenever you see something wrong, it is easy to assume it was either a sabotage or a mistake.
When you get lots of data, then you realize that the system itself must be broken, otherwise this would imply too many coincidences (statistically unlikely) or too many enemies (but why would a perfect system generate so many enemies?). But when you live in socialism, lots of data about its failures is precisely the type of information you are prevented from getting. And if you are in a position to get lots of data, you were probably already filtered for your loyalty to the regime no matter what.
Another way to get red-pilled is when a "mistake" is related to something you strongly care about, and when you naively assume it was an innocent mistake and keep trying to fix it, you are met with resistance that completely does not make sense in your model. Still difficult to generalize that the entire system is broken (not just one of its parts).
You've got to be impressed by a system that can get millions of people to *believe* that the system can be perfect although made of imperfect parts. That makes as much sense as thinking you can disassemble a Trabant and use the parts to build a Maserati. My working theory is that only the intellectuals were sufficiently disconnected from reality to swallow that laughable proposition.
This account of how less sophisticated people think fits my experience as an attorney in contract negotiations between companies and unsophisticated parties (such as the negotiation of an easement for a utility line across somebody's property). The problem is not that unsophisticated people are credulous; it is that their suspicion is unfocused and random. They lack the basic skill of reading a contract and distinguishing between boilerplate terms and material terms. So they will sometimes fixate on completely innocuous terms that are not really up for negotiation and miss the places where they are expected to barter for better terms. (Hint: ask for more money.)
Similarly, my grandfather's dementia manifested as a paranoia about financial matters - which is apparently very common. Maddeningly, this fear actually made him more vulnerable to people selling dodgy financial products, who sold them as providing greater financial security.
I do think that the general public is well aware that educated people tend to hide their lies in equivocal language and the things that they don't quite say. This is why attorneys preparing for a jury trial look for expert witnesses who are willing to speak bluntly and stick to their answers under hostile questioning - they are more believable. The heuristic of only believing experts when they speak simply and bluntly works pretty well under most circumstances, but it is vulnerable to exploitation by con men.
I’m surprised you did not note one very big caveat, considering you yourself have written about it recently.
So, you will not hear any experts say “immigrants definitely do not commit more crime in Sweden”. This is, as you correctly note, a bridge too far.
But what WILL happen is that journalists and politicians will say something like “there is no scientific evidence that immigrants commit more crimes, only a racist would believe such garbage” and there will be deafening silence from the experts - no one will speak up and say “well yes technically that’s true but only because you’ve made it literally illegal to publish any such evidence”.
The IPCC will write a carefully researched, appropriately caveated report. NYT will dutifully report the worst case 3 sigma high end of the model as an inevitable catastrophe, AOC will use this looming doom as a justification for universal daycare, and CNN will collectively pretend that tornadoes never happened in Kansas before global warming. All of them will claim that they are “trusting the experts”. And from the “experts”? Crickets.
So the experts themselves don’t blatantly lie, but if they allow their expertise to be cited in the furtherance of a blatant lie, well, it amounts to the same thing.
Give me a break. If scientists took care to pen closely-argued op-eds in the popular press debunking every dumbass extrapolation or bogus interpretation of their work, they wouldn't have time to take a crap, let alone do real work. Obligatory SMBC:
People are like that. You discover the cool fact that the uranium-235 atom actually decays by fission, releasing a bunch of neutrons, and the politicians and generals rush off and build 20,000 1MT nuclear warheads on hair-trigger alert, so everybody has to live under the shadow of instant vaporization for 45 years. Oops. You figure out how to send mail electronically, so it gets there in seconds and costs almost nothing, and entrepreneurs invent spam until 80% of Internet packets are junk. You write some brilliant networked-computing protocol, and right away some Russian gangs get to work exploiting it to build ransomware.
Since the work of almost any of us can be used for evil, maybe some kind of mutual agreement in which we always blame the sword-wielders instead of the blacksmiths or metallurgists would contribute better to social harmony and functional discourse?
That doesn’t work when experts DO take political stands, when they agree with them (e.g. epidemiologists praising BLM protests). If they ignore misrepresentations that they like, while jumping on misrepresentations they don’t (e.g. the Lancet letter about lab leak “conspiracy theories”) they lose the benefit of the doubt that they “just can’t be bothered with dealing with every misrepresentation”.
And no, they can’t go attack everyone who is wrong on Facebook, but when national level politicians are using exaggerations and misrepresentations to set policy - hell yeah I expect somebody to speak up.
Saying, “we’re just scientists, we can only do the science part, misinterpreting it or using it for evil is all on you guys” only works if they actually only do the science part. Once they start arguing for policy and taking activist positions, I think it’s fair game to question the things they choose to ignore.
Feel free to savage individual experts who take political stands. I'll be right there with you, throwing rocks -- er...assuming I agree the political stands are bullshit. I have no respect at all for a scientist who trades on his PhD to pretend to any more authority outside his area of expertise than Joe Sixpack. That's like "I'm not a doctor, but I play one on TV, and here's some medical advice..." Or like Hollywood actors on their 6th marriage lecturing the rest of us on morals. Contemptible.
I took issue with your sweeping statements about what "the experts" should *all* or *collectively* do to make sure their work is not mis-used. That's a bridge much too far. I believe in point-of-action individual responsibility, full stop. The guy who pulls the trigger is responsible for the murder -- not the gun, not the gun manufacturer, and not his mother for beating him twice a day every day from ages 6 through 16.
Yes. It is worth distinguishing between the clout chasers on social media who use their “expertise” (which may be something like has degree, is an associate professor) to boost their political activism or promote themselves as a brand, and the much larger group that is not engaging in such irresponsible behavior.
The activists and clout chasers absolutely harmed public trust, and it’s reasonable to assign them some portion of the blame on things like lack of vaccine uptake.
But it’s not reasonable to blame other experts for failing to silence them.
Yeah I know a few who should be shot, and I would definitely vote to revoke their guild privileges, were I still invited to the membership meetings of the Secret Brotherhood. But alas I was caught in the fornicatorium with one of the vestal virgins, so I'm no longer.
I don't think it's worth the effort to dig up examples, but there have been countless outright lies from "experts", just that I am aware of in the past 2 years, at least regarding public messaging.
The heuristic of "they will mislead, but not tell falsehoods about objectively untrue things" is unfortunately not accurate on too many subjects where the foundations of civil discourse and sensemaking have been thrown out the window in favor of what would be effective at achieving a person's goals.
"The savvy and the clueless" - and now tell the one from the other plus/or make the clueless accept that label. Hopeless: 1. Some anti-vaxxers are post-grads, reading lots, and mail you dozens of links for each of their points. (One colleague of mine). Many of those do not even look bad. 2. I quick-check the news on a mainstream website. Another colleague (second. edu. only) sees that and says: "Oh, that is such an obviuosly biased source!" 3. Matt Ridley, biologist and science-author: "At the time, given that I had written extensively on genomics, I was asked often about the chances that the pandemic started with a lab leak and I said this had been ruled out, pointing to the three articles in question. Only later, when I dug deeper, did I notice just how flimsy their arguments were." https://www.mattridley.co.uk/10784?button Titled: I WAS DUPED (btw. Scott Aaronson has a nice review of Ridleys new book https://scottaaronson.blog/?p=6183 "Briefly, I think that this is one of the most important books so far of the twenty-first century."
4. Who is savvy? Who is clueless? Why should anyone pay for a mainstream media, that has to be read as the "PRAVDA"? - Would the Marx/Lincoln tale have been written, if the author thought the WaPo cared for facts? Is the author banned now? Did the NYT fire the author of that Scott-hit-piece - or the whole board? As Scott wrote: I don’t want to accuse the New York Times of lying about me, exactly, but if they were truthful, it was in the same way as that famous movie review which describes the Wizard of Oz as: “Transported to a surreal landscape, a young girl kills the first person she meets and then teams up with three strangers to kill again.”
5. Nearly no one is evil (journalists are not, just not up to it), nearly everything is broken. Some will try to be "savvy" on Joe-Rogan-level - some turn to Scotts (Alexander, Aaronson, Sumner) - some to Scotch. SLÀINTE MHATH!
The uncomfortable truth is that - at an elementary level - statistics for journalists are simply numbers selectively used to validate a predefined narrative. Any applied use of statistics is simply not required to get followers. Certainly stats are too abstract to apply to "experts" who provide agreeable information. Journalists you read/hear from today don't have to have had stats training, or designed any study, defined assumptions, collected unbiased data from unbiased populations, written null hypotheses, or ever been peer reviewed. The truth is that multivariate analysis would correctly provide answers to so many (Covid) questions, but it is so foreign and difficult that, against daily journalism deadlines, such reporting is too hard, boring, and complex. Moreover, such analysis simply reduces what otherwise could be a truly "sizzling" headline supported by poor assumptions and little statistical accuracy. And THAT, my friends, is how journalists make money - getting followers - NOT by publishing the truth. If you're reporting on a car crash, sea turtles on the beach, sports, or weather, no stats knowledge is required.
In the mainstream media, I feel that *actual lies* are rare enough that one should mostly not expect them even from less-trustworthy outlets. I think Fox News tends to be very misleading, but I also think that's a result of topic and perspective selection, mostly not actual lies outside of some occasional non-host lies which go less challenged than they perhaps should be.
The WaPo story is common enough on the left end of the spectrum, which is where my bias lies... Not a blatant attempt to mislead, exactly, but involving some pretty big assumptions or leaps that make me raise an eyebrow. This is the sort of thing I've come to expect from political media I agree *and* disagree with, just as sort of a cost of doing business of engaging with political commentary.
In science, however, I'm getting *much more* open to making the assumption that a given researcher is a big damn liar. There have been way too many studies in the last several years, esp wrt COVID, that simply cannot be the result of good, honest scientific effort. This has to be true whether you're on the mainstream or skeptical side of the various COVID arguments; SOME of the research has to be bullshit.
I've actually been having this conversation with my younger sister. She's gotten increasingly conspiracy-minded in the last few years and I keep trying to tell her that *yes*, I am aware this or that group is not entirely honest but that is not the same thing as fabricating a new reality entirely from whole cloth. There is only so *far* you can stretch the truth before ~nobody believes you.
It sounds like you're prejudiced against conspiracy theories. A conspiracy theory is just like any other type of theory. Conspiring is one of the most fundamental human behaviors.
Yeah but ratting out your fellow conspirators is also one of the most fundamental human behaviors. Nobody has a hard time believing in visible conspiracies, e.g. political parties, PACs, interest groups, et cetera. It's the part where we're asked to believe that chatty human beings can get together in groups of thousands to hundreds of thousands and *nobody* ever spills the beans for his personal moment of glory. That's *not* human nature.
I totally believe in the Galactic Federation. It's the part where he claims various Earthling politicians are in the trust, and pay, of the Federation that I find highly dubious. What self-respecting BEM would hire human beings onto his Agile team? That's like Apple hiring bonobos to do the iPhone reveal.
Closely related: one of my rules of thumb over the past 5-10 years has been that if I see a complex, intelligent-sounding argument against a dominant narrative which itself seems to be completely ignored (no attempts at a rebuttal to it at least), then that contrarian argument is probably largely valid.
Counterpoint: there are good arguments that such contrarian arguments are following a sort of "just asking questions" routine where they just throw stuff out there for the purposes of poking holes in the dominant narrative, and that it's a waste of time to try rebutting them on the grounds that it's easier to start a bunch of fires than to put them out, etc. (e.g. see Sam Harris' attitude towards anti-vaxxers such as Bret Weinstein), and that this explains why certain arguments aren't getting engaged with.
To your counterpoint - usually this sort of thing can be dismissed with “even if I admit all these facts, does it actually disprove the premise?” Because usually “hole poking” type arguments end with a big leap from the facts to something like “and therefore we know these guys are wrong/lying about EVERYTHING”, and without that leap the holes aren’t all that compelling.
I agree that journalists, experts, politicians, Very Serious People, etc. largely follow the rules of their game when communicating. I don't agree that this makes their communication trustworthy in the sense Scott has in mind, where the savvy can extract a similar (though maybe weaker) signal to what we'd get if we had the unvarnished truth.
The problem is selection effects. To take an example that Scott has mentioned before: how much of a difference in, say, New York Times reporting would we expect to observe if the number of unarmed black people killed by US police in a year was 10 vs 100 vs 1,000 vs 10,000? Surely it would be dwarfed by the difference caused by (let's charitably call them) "consensus effects", where the NYT gauges the importance / tone / narrative of the subject in public discussion (especially elite discussion) and adjusts its coverage accordingly? Yet for any single article it's near-impossible to tell how much it's being driven by consensus vs reality. You'd have to undertake a dedicated long-term research project to extract the "good vs glorious" kind of signal.
Similarly, imagine the concerns of the liberal watching FOX News in the Yankee Stadium hypothetical. They could agree that FOX was reporting facts while still being deeply concerned that its framing and emphasis were calculated to stir up Islamophobia, and that it wouldn't have covered an attack by a white American shooter the same way. And they would have a point.
In the case of news organizations at least, we can confidently extract valid signal from them about certain kinds of ground-level facts, but 1) we rely on them as much or more for *analysis*-- an overall picture of what's going on and why-- and 2) they have almost unlimited influence over *what facts to show us* and have shown willingness to use that influence in the service of their preferred narratives. The fact that they follow a set of rules requiring them to be honest about the ground-level facts they do report is a very weak constraint in this context. How much signal about the true picture of the world could actually be available from it?
This is why I think the NY Times gradually eroding their reliability is a serious issue.
If I have proof of some government wrongdoing—something so outlandish that sounds like a conspiracy theory (Gulf of Tonkin incident faked! FBI tried to blackmail MLK into committing suicide! CIA proposed killing Americans and blaming Cuba!)—I want to take it to an outlet that:
1. Has the resources to investigate and verify my claims.
2. Has the resources to protect their sources.
3. Has the reputation such that if they publish it, *everyone* takes notice and takes it seriously.
For the longest time, I think that was the NY Times more than any other outlet.
If Jacobin publishes this, we'll all roll our eyes and say "okay, sure". If NY Times publishes it, we all agree to take it seriously.
Except now...some people won't, and that's reasonable, for the reasons Scott detailed. NY Times isn't likely to make something like that up, but they're not a bastion of truth. They're merely a bastion of truth-when-it-really-matters. Someone inclined to doubt the story will simply point to all the times the NY Times has misled readers, maybe intentionally or maybe merely negligently.
Whenever NYT or WaPo eats away at their reputation, I'm reminded of what they could be, what we need them to be, and what they arguably were for a long time: widely respected and trusted.
We have no "paper of record" for investigative reporting. I have a good amount of trust in ProPublica, but they lack the necessary name recognition. I don't think anyone's filling that void. It's a really hard void to fill.
I strongly agree with this post. It's a colossal loss for our nation, certainly related as both cause and effect to other colossal losses. It remains to be seen how bad the consequences will be.
A good example of this is Iranian state TV. They're mostly not a trustworthy source on most topics, but then again, they won't just lie about everything without any rules to their game. It is a critical survival skill to be able to draw those lines mentally.
Well, the writer is expert at one thing--covering up for experts who lie. But, as Dan Quayle, a mental patient, says, no one is fooled. By the way, who makes up the majority of mental patients? I ain't going to tell.
This seems neither kind (since its accusing Scott of covering up lies) nor clearly true (since it doesn't even attempt to provide proof for any statements). I like when the comments section is held to a higher standard than Marginal Revolution's.
It wasn't meant to be kind; it was meant to be an observation of the bias the author blatantly showed against common people, in favor of 'experts.' As another commentator noted, the author demonstrated disdain for common folk, and it's not worth the effort to do a point-by-point analysis to critique him.
They (probably) mean liberals. There have been some headlines I have seen claiming that democrats or liberals (choose your group) suffer more from mental illnesses than conservative people. I haven't read the primary research, but the claim seems very ripe for confounders, and I am skeptical that you could isolate the variables well enough to make a rigorous claim.
I see, thank you for clearing this up. It seems that conservative media latched onto a Pew survey from last year which found elevated rates of self-reported mental illness among democrats. Many good reasons to take this result with a grain of salt, like you said, but here are the alleged rates:
This was a great article! One of the best in a long time.
Here's an anecdote about an experiment in which I debias my news. Last Lent I stopped reading the news, by which I mean I stopped typing NYT, Wash Po, NR, Jacobin, Drudge, or anything like that into my search bar. I feared developing an even more misleading picture of the world because of the selection bias for news stories. Head over to the Washington Times and have your attention yanked into thinking about something that you didn't choose to think about and is likely not going to instruct, deepen, or delight you.
Instead I tried to use that saved time to read books.
It somewhat worked; I read more books in the past year than in any other past year. But as I read less news my substack subscriptions went up, and now I get slightly more substack articles than I have time to read. Still, this is higher quality reading by and large. And very few substack articles I read are biased in a way I can't easily control for, for with substack I know the author's interests, values, and outlook. Is this the coward's path? It's not bias that worries me; it's that even after controlling for it, what's left is vacuous.
Books and substacks and personal emails. I don't know that I am missing anything, if I don't google the news. But would like to hear a defense of reading and checking the news headlines.
Re climatologists: When ~95% of climatologists agree on something, I think preference falsification. And the push towards preference falsification is obvious. To disagree is not to get funding. I've mostly lost faith in any climate thing I hear, first it's distorted some by the media, and then also perhaps distorted by the scientists. Mind you I can see the local climate is warming, and I'm down with more CO2 as part of the cause. (you can replace 'part' with 50% or more.)
1.) Given preference falsification, and obvious conflicts of interest, why should I believe climatologists? (I did follow Richard Muller at Berkeley for a while; he seemed credible. https://physics.berkeley.edu/people/faculty/richard-muller)
2.) I would like to see some talk about possible good from global warming.
a.) longer growing season and more rain is pretty good for agriculture here in the Northeast where I live. (longer corn season, at a personal level.) Won't most temperate zones benefit from warming?
b.) How do you balance warming against the threat of the next ice age? (And why is the next ice age never talked about as a threat?)
I feel like I'm quite a sophisticated consumer of information. Over the past ten years, I've felt the level of sophistication that I'm *using* to sort out truth from error in the news going up and up and up (sort of the way you can feel your mathematics ability being stretched when you do harder problems).
The more I feel myself having to flex my news-consumption muscles, the more I think "We're all screwed, there's no way this doesn't end badly." Nothing about the past five years has made that seem wrong.
(On a tangential point: I followed both the primary sources and the media coverage of the Kavanaugh hearings very closely. Many of my priors about what the media would or wouldn't do were destroyed. I now think it's really quite a minefield, parsing the average media account of anything.)
Great post, by the way. As usual, you put things better than I could have by a lot.
Yeah so most scientists don't directly lie but instead just "bend the truth". Others absolutely do lie. Especially the ones with more political roles related to covid. See also the leaked Fauci emails.
As for the media: they don't directly lie if they think they will get caught. This is for most stories, so it is mostly fine.
Why is the icon for this a picture of the red square? The Russian media has not been particularly reliable for at least the last century, and I think this *does* include clear, explicit lying.
I think a lot of this is built on the idea that conspiracy theory believers are people who don't have good "signal decoding skills", burned themselves by trusting the experts, and are now mistrustful towards everyone. That is the exact opposite of the truth, though. Conspiracy theory believers have their own "experts" and trust them far more unquestioningly, and for far more extraordinary claims, than progressive WaPo readers trust WaPo's history column.

There is a spectrum from providing the most accurate and unbiased information to playing to the reader's prejudices and providing them with what they want to hear, with no care for accuracy whatsoever. The Washington Post is, in the big scheme of things, towards the accurate end (although it could certainly be closer). Towards the other end you have things like Infowars, Alex Jones, Wonkette, etc.

I'm sure there are people who distrust the full range, but 99% of the anti-establishment types are simply people who don't want reality to get in the way of their emotional fulfillment, and prefer news sources which tell them they are right and on the right side 100% of the time, provide them with a steady stream of outrage bait, etc. To suggest that relatively-accurate media is at fault for the existence of those people seems very naive to me.
"news sources which tell them they are right and on the right side 100% of the time, provide them with a steady stream of outrage bait"
WHOA, that sounds like a conspiracy theory! You think people are... conspiring... to provide outrage bait?! What are you, a conspiracy theory believer?
Anyway, you seem to be missing the point. Alex Jones throws out 10 crazy theories a day. His listeners don't just "believe" in what he says in the same way a WaPo reader believes in what they read. There is an asymmetry here: NYT/WaPo/etc. all take on the default establishment position, and then anti-establishment figures poke holes and spread doubt about those positions. NYT readers truly accept what they read as facts coming from experts, and they see these facts repeated everywhere throughout the mainstream media, confirming their beliefs. Alex Jones is up against the mainstream narrative and most of his coverage is framed around discussing and responding to it. What people get out of an Alex Jones rant is not "Yes, I truly and absolutely believe that they're turning the frogs gay", it's "Wow, the media is distorting and lying about a lot of stuff, I shouldn't trust them." The disinformation WaPo believers believe is far more pernicious because they truly believe it with all their heart, as it has been ingrained so deeply into them by a supposed consensus.
Also, your position that mainstream media is "relatively accurate" signals some severe cognitive dissonance. Being 90%, 99%, or even 99.9% accurate is not very good at all - it's quite easy to just *not lie*. The standard should be 100%.
Wrt misleading communication / suppression by experts and other "establishment" people, I think the more interesting way to look at that is the conflict between a virtue ethics framework where you are supposed to convey your best and most nuanced interpretation of the truth, and consequences be damned, vs. a consequentialist framework where experts are aware that some results will be used as propaganda fodder to support false claims, and censorship / misleading communication might well result in less lies and misguided beliefs overall. Like, studies showing that immigrants are overrepresented in violent crime will be used to convince a significant fraction of the population that every single immigrant is a violent criminal, and build a political movement on that lie, so better not give them the tools even if there is nothing wrong with the tool in itself. It's a kind of disinformation arms race - it's hard to stay honest if the other side can lie with abandon and there are no norms or laws punishing liars. You'll just get outcompeted eventually.
"2. If you just do a purely mechanical analysis of the ivermectin studies, eg the usual meta-analytic methods, it works."
Why you agreed to this I have no clue. The usual methods are to pool like studies with like studies and like outcomes with like outcomes. When you do this, and do a purely mechanical analysis of ivermectin studies, using the usual meta-analytic methods, it's not clear that "it works". And that's not just for death, but numerous secondary outcomes as well.
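To make the "pool like studies with like studies" point concrete, the standard mechanical step in a meta-analysis is inverse-variance (fixed-effect) pooling of a single shared outcome: each study's estimate is weighted by its precision. A minimal sketch (the function name and the numbers are hypothetical illustrations, not real ivermectin data):

```python
import math

def pool_fixed_effect(estimates, std_errors):
    """Inverse-variance (fixed-effect) pooling of effect estimates.

    Each study's weight is 1/SE^2; the pooled estimate is the
    precision-weighted mean, and the pooled SE is sqrt(1 / sum of weights).
    All estimates must measure the SAME outcome on the same scale -
    pooling unlike outcomes together is exactly the error being criticized.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical log-odds ratios from three studies of one shared outcome.
log_ors = [-0.2, 0.1, -0.05]
ses = [0.3, 0.25, 0.4]
est, se = pool_fixed_effect(log_ors, ses)
lo, hi = est - 1.96 * se, est + 1.96 * se
print(f"pooled log-OR = {est:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The design point is that the pooling arithmetic itself is trivial; the contested judgment calls are upstream, in deciding which studies and which outcomes are "like" enough to enter the same pool.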
Scott! We gotta stop meeting like this! Anyway, I wrote a little something on Twitter (https://twitter.com/alexandrosM/status/1486473068356591618?t=B4yxR4p4ax4bampZAlBkGQ&s=19), I hope to organize my thoughts into a full-length response, but if I had a wish it would be to devote enough time to the conversation so we can agree on what we agree on, and what we disagree on, and why. I am sensing an urge to rise to the abstract and produce an omni-explanation, but this terrain is not friendly like that. Building a position requires in-depth work, and being willing to hold an agnostic stance.
If I have just one request, however, it would be to retire or taboo words like "conspiracy theorist". It seems to function as an easy way to signal someone is wrong without doing the work to demonstrate it, and it's kind of concerning to see leading lights of the rationalist community fling words like these around, as if we haven't all read "semantic stopsigns" and the like.
I know this is a long distance to cross, but in 2022 - when we know there was coordination among experts to suppress the lab leak hypothesis, coordination among experts to suppress the Great Barrington Declaration, and the experts were clearly wrong in overstating both the safety and efficacy of the covid vaccines, and masks, and school closures - well, maybe there's something missing from our model. We need to stop climbing to ever higher ground and start double-checking everyone's work, seriously considering that they may have actually gotten even more things completely wrong.
Bounded Distrust is also how I feel about the CDC and I can see why it sounds absurd people who don't have their distrust delineated the same way I do.
When the CDC says "Hey, we have some recommendations!" my reaction is "the only reason your recommendations matter is because some people are dumb enough to still listen to them." I repeated CDC talking points as late as May 2020 and I feel like an idiot for it. At this point, it feels inexcusable to still be non-critically repeating their recommendations and talking points.
On the other hand, when the CDC says "Hey, check out this cool new data on vaccine efficacy!" my reaction is "Oh, hey, neat-o! Hey, everybody, look at this data, we can trust it because it's from a highly-reliable source: The CDC!"
Because while the CDC will say some bonkers things, fail in astounding ways, and make shockingly misleading statements, as far as I can tell they have yet to falsify data. Falsifying data is a different category of unreliability than the kind the CDC engages in.
Without this explanation/understanding, though, my differing reactions sound insane.
The FDA's EUA analysis, even assuming 95% vaccine efficacy, demonstrated that the risks of the vaccine for young males were comparable to (or even slightly riskier than) Covid, and then they approved it anyway. And then the same happened with boosters.
Some people in the committee actually resigned over this, which just means the same thing will continue to happen. So it doesn't really matter whether they falsify data or not - their conclusions are almost completely unrelated to the data.
Call me cynical, but I would not be at all surprised to learn that a major news organization faked a terrorist attack -- or, at least, reported on a routine mugging as though it was a terrorist attack.
A way to do what ? If you mean, "to report on a mugging as though it was a terrorist attack", then it's pretty easy: "According to some eyewitnesses, the attacker, identified as John Smith, walked into the grocery store shouting racial slurs and brandishing a weapon. We now take you to our criminological expert, Dr. McFakeGuy, who will explain how he was able to deduce the attacker's exact rank in the KKK based on his body language..."
> The reason why there’s no giant petition signed by every respectable criminologist and criminological organization saying Swedish immigrants don’t commit more violent crime than natives is because experts aren’t quite biased enough to sign a transparently false statement - even when other elites will push that statement through other means. And that suggests to me that the fact that there is a petition like that signed by climatologists on anthropogenic global warming suggests that this position is actually true.
So, the obvious counterexample is of course American treatment of race. You don't see anyone out there arguing that blacks commit no more crime than whites. But you see a very large number of people arguing with a straight face that blacks are no dumber than whites. One statement is no less transparently false than the other.
(You also see many, many "experts" telling everyone with a straight face that Race Does Not Exist, so black and white are not meaningful categories at all. The implications are never explored and "nobody" takes the claim seriously, but it has plenty of public signatories.)
I would argue that the difference is the amount of elite investment in pushing their preferred claim from behind the scenes. There is not enough cover -- yet -- for people to claim in public that blacks don't commit any extra crime. But cover fire has been laid down for the claim that blacks are no dumber than anyone else. And so there are many public signatories to that claim despite the fact that it is transparently false.
(Connecting back to global warming, we can apply my heuristic to ask how much investment has been made in the ability to make claims whether or not those claims are true, and use the answer to inform how trustworthy we think the claims that actually are made are.)
>Now suppose FOX says that police have apprehended a suspect, a Saudi immigrant named Abdullah Abdul. They show footage from a press conference where the police are talking about this. Do you believe them?
If our positions were reversed, and I was Scott and Scott was a reader, and if Fox actually lied in this way, I could easily come up with excuses as to why this doesn't count. For instance, maybe Fox got bad information and they honestly thought the police apprehended someone but were mistaken. Maybe they showed footage from the wrong conference not to deceive, but because any conference pretty much looks like any other and it's not deceit to find something that gives a good visual representation of the kind of thing alleged to have happened; they're just being overenthusiastic. Or maybe I can just say that the audience was intended to figure out the lie, since they can't fail to know Fox News's reputation.
In other words, under Scott's standards, it's too easy to make excuses and say that even a blatant act of dishonesty doesn't disprove his claim. It's unfalsifiable.
The media said that Kyle Rittenhouse killed unarmed black men. That's blatantly false. Other examples include Trump conspiring with Russia and almost anything the government and media have said for Covid that is now out of favor. By any sane standards, they have lied in the way Scott says they don't, but I'm sure they could be explained away.
For that matter, the New York Times called Scott a white supremacist. Scott said "This seems like a weirdly brazen type of falsehood for a major newspaper." Yet the Times did this all by making insinuations and saying literally accurate but misleading things, so by Scott's current standards, brazen has become not-brazen.
I get the point to the article and I think I agree. I feel like I'm one of those that reads the tea leaves pretty well in both science and politics. That said, two things pop into my mind and I don't know where to fit them with this attitude.
First, I was taking a college course in the Soviet Union during the first Gulf War. Our class was staying at the Kosmos Hotel in Moscow. It was run by the Soviet state tourism agency and did not have CNN at the time. Everyone gathered in the hotel bar on the night the war started to watch the Soviet newscast. Russian friends translated the newscast and told us the news said Iraq had shot down over 30 US planes and the war was a disaster for the US. We heard many other things that turned out to be absolute lies. It wasn't FOX, but the experience of having a news channel not only shade but outright lie about objective facts never left me.
Second, we have seen many instances of outright scientific fraud. We have an insane amount of major studies not being able to be confirmed. Finally, we have pretty good recent evidence of political interference in controversial topics. I feel like I still trust the experts, but any time a topic is in anyway controversial, I get a squick feeling in my stomach.
Is it possible to watch somewhere North Korea's news with English subtitles? Especially the parts about things happening outside of North Korea. That could be an interesting experience.
I think an important issue in the current media environment is that a third category, "analysis," has become more prevalent and fits between the traditional dichotomy of "information" and "opinion". Anecdotally, I think when people criticize a news outlet for bias, it is often because they are treating an article as information when it is really analysis.
"Information" would be the kind of clear-as-day facts of the kind you mentioned. "The stock market went up today," "The President said x during his speech," "Abraham Lincoln was assassinated in 1865", etc. Of course some bias can be introduced by a publication in terms of what to cover or whose quotes to use, but the base facts are true.
"Analysis" also includes facts, but offers the author's interpretation of them. To me, the Lincoln articles you linked to are great examples of this. Just because the Washington Post article does not appear on the opinion page does not mean that it should be treated as information. The Post author and the rebuttal were using the same pool of facts but analytically came to different analytic conclusions. If someone finds the rebuttal more persuasive it doesn't mean that the Post was presenting inaccurate information, but rather you disagree with the author's analysis.
I think one of the most prominent examples today of this difference is the treatment of how/whether the 1619 Project should be used in schools. At the extremes it's considered gospel or heresy (when I would consider it historical analysis), but to me its best use in the classroom would be as a tool for debate, especially if paired with another publication that used the same facts but came to different conclusions.
I think when I try to be savvy, *I* end up sounding like the conspiracy theorist, and I think the article is missing the fact that this perception runs both ways.
For example, I generally assume any item on the nightly news about a pharmaceutical is either a native ad, or the result of a lazy reporter filling time off a press release. If Drug X really marked a turning point in the fight against cancer or dementia, you'd know it, there would be a giant cultural event surrounding this and everybody would be acting a bit differently in their presentation.
But it seems that most people believe these stories are intended to be factual, and when I tell people that pharma stories on tv are basically advertisements and you aren't supposed to believe them, they act like I'm the conspiracy theorist calling it "fake news". By your framing, their alternative to being savvy about this is to conclude that all of these claims were intended as literally true (as framed, even if the text is filled with qualifiers) and that in fact these companies are rampant liars who cannot be trusted at all -- which is what you're calling the "conspiracy theorists". But from their POV "this particular set of people are chronic liars" or "this particular type of claim is always nonsense" isn't a wild conspiracy, whereas the "savvy" person's attempt to play Kremlinologist with everything anyone says looks much loonier.
> But: have you ever heard an expert say, in so many words, that immigrants to Sweden definitely don't commit more crime than natives?
I've heard something structurally and truth-value-wise similar. I've heard:
1. Travel restrictions and gathering limits are racist scaremongering which has absolutely no effect on disease spread.
2. It will take two weeks to bend the curve and stop the spread of the pandemic.
3. Masks are only necessary for medical professionals and are useless for ordinary people.
4. Mass gatherings and protests are good for public health and do not cause any concern even in the middle of a pandemic, as long as the issue being protested is racism.
5. There's absolutely no scientific or at all plausible basis to the idea that the coronavirus originated in a research lab in Wuhan; it is a baseless (and racist) conspiracy theory that no real scientist or expert ever agreed with, and the scientific consensus is firmly and entirely on the side of the proven fact that it originated from an animal source without any involvement of the Wuhan labs.
6. Inflation is a sign of a very healthy economy and is very good for everybody but billionaires.
Of course, some of those had fewer "experts" than others to state them, and they had different lifetimes. But all of them were said in public, and as far as I know, none of the "experts" saying them were publicly shamed, officially stripped of their "expert" status forever, and forced to wear the "dunce" caps. So yes, I think at least some experts would absolutely go on TV and say any lie they want, and I usually have no way to tell this kind of expert from any other.
I must object to the "conspiracy theorist" classification here. We're way past "conspiracy". People lying to me on TV all day long aren't "conspiring" - they are doing their business in the open, brazenly and boldly. They are in power, they are putting the metaphorical boot on the face of the truth, and while they still fail to keep it there forever, they certainly keep trying. I don't know what to call it, but it's not a "conspiracy".
Yes, we should distinguish between "secret conspiracies" and "obfuscation conspiracies".
The argument against "secret conspiracies" is that it is very difficult for a large number of participants to keep a secret. Sooner or later, someone will change their mind; and they can leak the message anonymously. Also, the outsiders who are curious about something may figure it out.
But "obfuscation conspiracies" just make something complicated, and even if the message is leaked or someone figures it out, most people won't understand it, so you just need to *deny* its translation into plain language. So people will be like: "oh, great, there is no poison in the water, there is just some molecular contamination of dihydrogen monoxide, some expert chemistry stuff, nothing that we ordinary folks need to worry about, right?"
The rationalists have a bias here - the entire rationalist exercise collapses when basic facts and data in mainstream academic papers and the New York Times cannot be taken at least somewhat at face value. When these institutions are subverted to political ends, which I posit they have been, the types of conversations that rationalists enjoy having become impossible. This is why people like Scott and Sam Harris are reluctant to drift too far from the establishment narratives around things like Covid. Scott's writing about ivermectin is engaging and I appreciate it, but it always comes across as being driven by motivated reasoning. Because if the establishment view on ivermectin is a corrupt lie (which I posit it is), then we are in a world where nearly unbounded distrust becomes appropriate, and rationalist-type conversations become impossible. We move from the world of facts and data into the world of mythology and spiritualism. Which I posit is the only rational world to inhabit these days.
Unless, of course, ivermectin actually doesn't cure covid. Hypothetically speaking.
Then, I guess, the proper thing to distrust might be people on internet tirelessly expressing strong opinions about things outside their expertise which they learned on internet from other people tirelessly expressing their strong opinions, etc.
Ultimately, whether ivermectin works is orthogonal to my point. My point is that Scott's ivermectin analysis appears to have originated from a pro-establishment bias, and appears to have been the result of motivated reasoning rather than scientific inquiry. My expertise is in motivated reasoning. I'm a lawyer. Motivated reasoning is what I do. In the case of ivermectin, Scott was presented with data that showed ivermectin worked and establishment experts saying it did not. He really wanted the establishment experts to be right. He thus searched for a possible confounder (worms!) and threw it out there, despite no studies or good data supporting this hypothesis.
The worms confounder is the sort of thing a criminal defense lawyer would come up with to raise reasonable doubt against a strong case by the prosecution. It is not the sort of thing one would come up with by taking an unbiased look at the data. An unbiased look at the data creates at least a presumption that ivermectin reduces severity of illness. Actual good evidence should be required to overcome that presumption. Scott had no such evidence to support his theory. He just had an unsupported rationalization to back up the establishment experts. Because if the establishment sense-making apparati are completely corrupt and unreliable, as opposed to merely being deserving of "bounded distrust," the rationalist sense-making enterprise becomes a fool's errand.
I don’t think this fairly describes the situation with the ivermectin studies though. From an early part of Scott’s ivermectin post:
“Of studies that included any of the endpoints I recorded, ivermectin had a statistically significant effect on the endpoint 13 times, and failed to reach significance 8 times. Of studies that named a specific primary endpoint, 9 found ivermectin affected it significantly, and 12 found it didn’t.”
This sounds like when you look at ivermectin studies in a way that limits potential p-hacking, about half the studies say it has a significant effect and half say it doesn’t. Which is weird! If it did absolutely nothing, we’d expect an overwhelming majority of studies to find no positive effect. If it treats covid, we’d expect… I think better than 50/50 in studies? This is genuinely weird. The existence of some sort of confounder makes a lot of sense. Maybe the confounder is worms. Or maybe it’s something like diet, and the drug works on covid in the presence/absence of some compound that has regional variability in human consumption. Or maybe its effect size really does sit precisely at the limit of our ability to detect, so that by chance alone it shows up as significant half the time. If that is the case though, then this becomes less important going forward as covid-targeted antivirals become approved and mass produced.
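The "effect size at the limit of our ability to detect" scenario can be made concrete with a small simulation. This is a hypothetical sketch, not an analysis of the actual ivermectin literature: if a drug's true effect puts a typical study's expected z-score right at the 1.96 significance cutoff, statistical power is about 50%, so roughly half of honest, identically designed studies will come back "significant" and half won't.

```python
import random

random.seed(0)

def simulate_study(true_z=1.96):
    """One hypothetical study: the observed z-score is the true
    effect plus unit sampling noise. true_z=1.96 places the effect
    exactly at the detection limit (power ~= 50%)."""
    observed = random.gauss(true_z, 1.0)
    return observed > 1.96  # significant at the usual cutoff

results = [simulate_study() for _ in range(10_000)]
frac_significant = sum(results) / len(results)
print(round(frac_significant, 2))  # close to 0.50
```

Under these assumptions a near-even split of significant and non-significant studies is exactly what you'd see, which is why the 50/50 pattern alone can't distinguish "marginal real effect" from "confounder present in half the settings".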
For examples on this blog where this sort of analysis results in Scott deciding that the establishment conclusion is wrong, see the EEG post from this week (p-hacking) or the post on masks from early 2020 (confounder - did subject actually wear mask after control/treatment assignment)
This is a really interesting point, because it's something I come up against in my role as a union agent. My members, out of long experience with Human Resources, come into any situation with the belief that management is probably lying to them about EVERYTHING, and I have to teach them that there is a trick to it: most half-decent HR types won't lie about certain matters--issues that they don't have a stake in, issues where telling the truth will help their cause, or issues that are easily fact-checked and objective.
It's so hard to butt one's head up against the way that mistrust spreads away from all touch with rationality.
Imagine that you are the kind of person who trusts neither CNN nor FOX. You are in an airport boarding area where a screen is showing CNN. You see news of a shocking event: not a school shooting, but a person driving a car through a crowd of people participating in parade. The text scrolling at the bottom of the screen gives details of dead/wounded, and goes on to note that "motive is unclear". A press conference with Police representatives appears to support this.
Later in the day, at a different airport, you see a FOX news piece about the same event. In that news piece, the Police are saying that motive is unclear. Yet the FOX reporters have somehow gotten screenshots of social media posts apparently created by the perpetrator. The perpetrator's social media was apparently full of racist denunciations, of the kind of people who were victims of the assault. The FOX report implies that motive is easy to discern, even if they repeat the official statement that motive is unclear.
The perpetrator was a Black man, and the victims of both the racial hatred and the attack were white.
What information do you gain from these two stories?
I wish this were a hypothetical example; it is not hypothetical. I must credit David Friedman with noticing, though he noticed this distinction at two major newspaper websites rather than on FOX/CNN channels.
Welp. In CS there is a really obvious measure on how to handle untrustworthy sources:
Ignore them in your decision making, but track their reliability. If their reliability score recovers enough, you can again give some measure of trust to them.
The same goes for FOX News and the Springer press (i.e. BILD, Welt, and others) here in Germany.
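The "track reliability, ignore until it recovers" idea above can be sketched in a few lines, loosely in the spirit of multiplicative-weights schemes from learning theory. The outlet names, the penalty/recovery multipliers, and the 0.5 trust threshold are all illustrative assumptions, not a real scoring system.

```python
class SourceTracker:
    """Toy reliability tracker: each source has a weight in (0, 1];
    debunked claims shrink it, verified claims let it recover."""

    def __init__(self, sources, penalty=0.7, recovery=1.1):
        self.weights = {s: 1.0 for s in sources}
        self.penalty = penalty      # multiplier when a claim proves false
        self.recovery = recovery    # multiplier when a claim checks out

    def update(self, source, claim_was_true):
        w = self.weights[source]
        w *= self.recovery if claim_was_true else self.penalty
        self.weights[source] = min(w, 1.0)  # cap trust at 1.0

    def trusted(self, source, threshold=0.5):
        # Below threshold, ignore the source in decision-making;
        # it can still earn its way back over the line.
        return self.weights[source] >= threshold

tracker = SourceTracker(["outlet_a", "outlet_b"])
for _ in range(3):                      # three debunked stories
    tracker.update("outlet_b", False)
print(tracker.trusted("outlet_a"))      # True
print(tracker.trusted("outlet_b"))      # 0.7**3 = 0.343 < 0.5, so False
```

Multiplicative updates have the nice property the comment describes: a few bad calls sink a source quickly, but trust is recoverable rather than permanently zeroed.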
Concerning the Ivermectin thing:
"1. If you just look at the headline results of ivermectin studies, it works.
2. If you just do a purely mechanical analysis of the ivermectin studies, eg the usual meta-analytic methods, it works.
3. If you try to apply things like human scrutiny and priors and intuition to the literature, this is obviously really subjective, but according to the experts who ought to be the best at doing this kind of thing, it doesn't work.
4. But experts are sometimes biased.
5. F@#k."
This doesn't imply a paradox; it implies that both are probably true and that your model is probably too simplistic (i.e. the effect is real, but the proposed mechanism isn't).
It seems to me that the article is about two very different things. One is the limits of how much gets made up, though it seems to me there was something false about babies being taken out of incubators during the run-up to the war on Iraq.
Perhaps atrocities are more likely to be inventions, and speaking of Yom HaShoah, I gather that one of the reasons accounts of the holocaust were being discounted was British lies about German atrocities in WWI.
Excuse me, I might be a little distracted. However, it's quite possible for the mainstream media to get some things wrong, whether they're making up lies or accepting other people's lies.
Maybe they're most likely to be trustworthy about medium-intensity things. Big enough that they might be paying attention, but not so big and emotional that they can't think straight. This is only a theory though.
The other claim in the OP is that it's possible to pull signal out of media noise if you know enough about both the media and the world. This might be easier when the government has centralized control of the news.
But when there are lies and nonsense coming from several directions? Maybe not impossible, but a lot harder.
I'm reminded of a favorite bit from _Illuminatus!_. There was a man with file cabinets full of clippings about the first Kennedy assassination. He kept gathering information because he thought there was one fact out there which would make it all come together. He didn't realize half of it was lies made up randomly by people covering their asses.
This is such a remarkably biased article about bias in the media, like so many bizarre hot takes these days I am left with my mouth hanging open. FOX is biased but they wouldn't tell obvious lies and make up events out of whole cloth? No, but you would have to be blind, deaf, and dumb not to know that CNN, MSNBC, CDC, FBI, and FDA et al. would. This is where common sense comes in. And common sense seems to be more common amongst common people. Privileged people living the high life can afford to believe things that are obvious lies to less privileged people. The lies of MSM about BLM and C19 alone are mind-boggling and terrifying. Lies have been blatantly told about major cities on fire, looted, under siege; blatant lies told about C19. How do you figure tens of thousands of doctors saying ivermectin does work? How do you figure tens of thousands of doctors and nurses, etc. refusing to take the shot that obviously doesn't work and is harmful? This was a truly weird article. Go talk to some normal, everyday people who live in normal everyday places. Never mind. I don't think obvious evidence would make any difference to the person who wrote this.
This entire article was very disappointing. It was basically just Scott restating that he has authority bias in as many ways as he can think of, without actually stating it.
He goes on and on, for instance, about how FOX would never make an accusation so awful and obviously wrong that it couldn't be trusted regarding election integrity, while seemingly being totally unaware that the blue media peddled a completely made up, totally untrue, zero basis in fact "Russian Interference" narrative for three consecutive years, and a very large number of people bought it. Most of them still do buy it.
Then he pivots over to ivermectin and runs "trust the experts" again, but we know for 100% fact that "the experts" suppressed the lab leak hypothesis for fear they would get called a racist by their tribe, and we know that Peter Daszak proposed to manufacture a virus exactly like Covid-19 to DARPA in 2018 *and* to do that manufacture in Wuhan, *and* he orchestrated the Lancet Letter that said lab leak was impossible two years after he literally proposed to build the thing in the lab where the lab leak happened.
So we have evidence of "the experts" lying for tribal reasons, we have evidence of the media apparatus "the experts" use lying for echo chamber reasons. The useful thing for Scott to do would be to start by setting ivermectin aside completely, and doing the smallest remotest bit of work to understand how these tribal social mechanics can, have, and continue to make an entire body of experts *wrong*. Completely without a secret conspiratorial cabal, experts are choosing as a flock to be wrong on purpose, and we know this is a thing that is happening. Unpack that, figure it out, and then come back to ivermectin.
And good gracious, this line of reasoning:
1. If you just look at the headline results of ivermectin studies, it works.
2. If you just do a purely mechanical analysis of the ivermectin studies, eg the usual meta-analytic methods, it works.
3. If you try to apply things like human scrutiny and priors and intuition to the literature, this is obviously really subjective, but according to the experts who ought to be the best at doing this kind of thing, it doesn't work.
4. But experts are sometimes biased.
5. F@#k.
...should end at step 2. Or, if you do go to step 3, you should at least apply the same level of rigor to ivermectin that you used on fluvoxamine a month later, which you obviously didn't do, and any critical readers noticed. So then the question is "why did Scott Alexander spike ivermectin while unspiking fluvoxamine?" and the easy answer, the one most likely, after we've done all of our "why do experts lie" analysis up front, is either tribal ingroup signaling, or it's financial pressures to maintain subscriptions, which is exactly why the "Russian Interference Narrative" ran for three years anyway.
To put a fine point on it, Alexandros used your own methodology and simply completed it where you got lazy, and the conclusion flipped from "ivermectin doesn't work" to "ivermectin works," and your only response was "it's not about paper selection but about endpoint quality." That's just word salad, dude.
I personally don't even care if ivermectin works, mostly because I'm not particularly scared of Covid, but I am fascinated to watch the Sensemaking Crisis demolish my most trusted thinkers and I'm horrified to think about what that means for the human race in general.
Point 2 gives way too much credit to ivermectin: "If you just do a purely mechanical analysis of the ivermectin studies, eg the usual meta-analytic methods, it works."
That's not the case at all. Multiple meta-analyses that try to exclude fraud and highly biased studies have found that ivermectin has no effect.
It's actually quite instructive your first link is the Andrew Hill meta analysis, because he admitted on zoom to faking it because he's on big pharma's payroll.
This is the sort of stuff fundamental to the sensemaking crisis. And it's the sort of stuff Scott was trying to end-around with his worms article. And he was doing it the right way, until he got a little lazy (intentionally lazy?) and tried to manufacture a landing pad for himself that didn't put him in the IVM camp, either for $ reasons or reputation reasons.
"he admitted on zoom to faking it because he's on big pharma's payroll."
Literally nothing in that sentence is true except that Hill had a Zoom call.
He didn't admit to faking anything. To the contrary, his prior results were faulty because they relied on studies that turned out to be fraudulent or highly biased. No one has yet identified a factual or statistical error in Hill's revised article, or given a reason why it should put more weight on highly biased studies.
Moreover, he didn't say anything about being on "big pharma's payroll," which would be a weird thing to say when it's not true.
The zoom call was far earlier than any accusations of fraud. And he admits that his sponsor determined his conclusion. That right there is academic misconduct, and in context, a crime, regardless of whether ivm works or not.
Not clear what the dates are. And in the video, he doesn't admit that his sponsor determined his conclusion. Not sure why some folks seem to have such a different interpretation of this video.
I'm not Alexandros, and Alexandros has sometimes confused "motteposting" with me. We're not sure who he is yet.
It's pretty interesting watching the differential in discourse regarding these topics between the comments section and r/themotte. There appear to be different factions of Scott fans who approach sensemaking differently.
" Or, if you do go to step 3, you should at least apply the same level of rigor to ivermectin that you used on fluvoxamine a month later, which you obviously didn't do, and any critical readers noticed. "
Even if the evidence left standing on ivermectin looked as good as the evidence on fluvoxamine (not true), they would still be in very different standings. Fluvoxamine hasn't been the subject of numerous frauds and highly biased studies, and isn't constantly being plugged by snake oil salesmen and their fans (*Alexandros) as being a miracle cure, etc. From a rational perspective, a drug with a solid 30% effect (but no wild exaggerations and frauds) is more likely to be valid than a drug with a 30% effect that is mostly promoted by hucksters.
You falsely accuse me of saying things I have not, but of course this is par for the course. However, the amount of junk studies associated with ivermectin has not been uncharacteristically high. About 20% of studies are generally expected to be in that category, and results from the ivermectin literature are actually lower than that, believe it or not. On the other hand, studies from the Middle East are well known to be of very low quality most of the time, which we have also seen with ivermectin. Once again, nothing out of the ordinary, except for the politics.
Correction: Ivermectin has been plugged almost exclusively by true believers like Kory, Lawrie, Marik, etc. They have lied about it being a miracle cure, and have promoted fraudulent studies without shame or apology. You have mentioned repeatedly that you donated to them (FLCCC). Nonetheless you haven't *explicitly* endorsed their "miracle cure" statements, but instead when pressed, back off and say you care about the rationality of the discourse, or the information ecosystem, or something like that. Fair enough.
I've donated to the people who have also been giving fluvoxamine for a year, and the people who were ahead of the curve on steroids for severe hospitalized cases. They also believe ivm works. Btw, this wording on ivm starts from Satoshi Omura himself, and of course nobody blinks when the vaccines get the same kind of characterization regardless of their limitations.
Not sure what the relevance of a 2011 article on ivermectin is here. It does say ivermectin is a "wonder drug"--for many different parasitic infections. But so what?
By analogy, Gleevec is a miracle drug for certain types of leukemia, but that doesn't mean anyone gets an automatic pass if they say, "Gleevec is a miracle drug for literally any other random and unrelated disease, now including Covid! Here are a bunch of crappy and often fraudulent studies, along with some eyeballing of international charts that don't even pass the laugh test, along with some cites to people who are so dishonest that they claim bicycles kill 1,000 times as many people as Covid (e.g., Marik in the recent Ron Johnson hearing)."
"If five out of 30 trials have serious problems, perhaps that means the other 25 are up to snuff. That’s 83 percent! You might be tempted to think of these papers as being like cheaply made light bulbs: Once we’ve discarded the duds with broken filaments, we can just use the “good” ones.
"That’s not how any of this works. We can locate obvious errors in a research paper only by reanalyzing the numbers on which the paper is based, so it’s likely that we’ve missed some other, more abstract problems. Also, we have only so much time in the day, and forensic peer review can take weeks or months per paper. We don’t pick papers to examine at random, so it’s possible that the data from the 30 papers we chose are somewhat more reliable, on average, than the rest. A better analogy would be to think of the papers as new cars: If five out of 30 were guaranteed to explode as soon as they entered a freeway on-ramp, you would prefer to take the bus.
"Most problematic, the studies we are certain are unreliable happen to be the same ones that show ivermectin as most effective. In general, we’ve found that many of the inconclusive trials appear to have been adequately conducted. Those of reasonable size with spectacular results, implying the miraculous effects that have garnered so much public attention and digital notoriety, have not."
The selection of the 30 trials Heathers wrote about is for the biggest results. Obviously once you select for outliers you get outliers. This is elementary, really. And we don't even know exactly which studies those are. The crusaders of transparency are shockingly opaque.
We do know exactly which studies have been critiqued by Heathers et al., because they did it publicly.
And the point about selecting for outliers cuts in the opposite direction of what you're implying here! If Heathers et al. selected the 30 (or so) most well-known studies with the biggest results, they had a sample of studies that provided the best case scenario for ivermectin working. And even then, it turned out that the studies with the biggest effects were the most unreliable, whereas the studies that looked defensible were inconclusive. So the selection bias (if any) works in exactly the opposite direction of what you're trying to imply--even with the "best" studies for ivermectin on the table, ivermectin still comes up wanting.
Very good piece and pretty much the way I myself navigated the news for the last few years: confident I could recognise spin from what is likely to be factual and just shrug at spin and bias. However, the game changer in this pandemic was the way the idea of a potential lab leak of SarsCov2 was dealt with by the scientific community. The sheer firepower deployed by top scientists (and big tech and mainstream media) in attempting to torpedo the idea of a possible lab leak (and the careers of any one who gave it credence) became paradigm shifting for me. We learned of Gain of Function research carried out at the Wuhan Institute of Virology, NIH funding, Kristian Andersen professing to believe the virus looked engineered in private (email to Fauci) and calling the same idea ‘tinfoil hat conspiracy’ publicly, the shenanigans with the WIV Coronavirus database, RATG13 etc.
To stay with the blog’s metaphor it was the equivalent of discovering that Fox News had indeed staged a mass shooting and was threatening those in the know to keep their mouths shut.
You basically say ‘there are rules and savvy people know what they are’. This is true, but shouldn’t we all be constantly recalibrating on the basis of what suddenly enters the realms of possibility? The scientific community made an extraordinary attempt to control the narrative on the origin of C19 in a bona fide conspiracy to obfuscate and intimidate and smear anyone who knew better (with the help of big tech and the press, of course). Isn’t this a new ‘rule’ I have to take into account from now on? How does this affect the meaning of the expression 'spreading misinformation'?
Interesting to see so many commenters here saying "yes, yes, UNTIL things changed with [COVID/Trump/BLM/insert issue]!" Nothing has changed, fundamentally. You're still basically just determining how much confidence to assign to each piece of information you receive. I wonder how much of an emotional component there is to this way of thinking. People get pretty insulted or disgusted to discover they were misled about something. Committing to disbelieve the source absolutely in response, however irrational that actually is, might function as a way of hitting back.
Another idea is that maybe these people believe that overreacting to instances of media untrustworthiness will eventually improve reporting standards.
This seems insightful and useful - I've recently had in-depth discussions with colleagues who each generally carried what to me is a conspiratorial perspective on COVID and government response. They were actually enjoyable conversations on balance, we kept rapport, and I began intuiting underlying differences in thinking to account for my perspective seeming categorically different from theirs. One of them is scientific literacy. My greater capacity there, though, may have also let me lose sight of some larger-picture stuff which they were frustrated and suspicious of: "They told us the vaccines would work and now they don't". I could counter that with points about probability and risk and viral mutation - but I had extracted so much fine grain signal that I had forgot the basic point that they were pointing out: vaccines were represented as our way out of the pandemic and it is not working out that way. Which has implications moving forward (I don't believe it's because of some heinous conspiracy as they may be tempted to, but the point stands, and my perspective shifted quite significantly).
> I had extracted so much fine grain signal that I had forgot the basic point that they were pointing out: vaccines were represented as our way out of the pandemic and it is not working out that way.
This is a good point. The fine-grain version of "vaccines are our way out" is still true in a substantial (if incomplete) way, but if you hear it as something simpler that implies "if you get your shots, we'll stop trying to tell you what to do", then you're going to be pretty disappointed.
I agree that talking to people who have a hard time extracting signal can be helpful. I'm less savvy about various geopolitical things than some of my friends. I'll say something to them like "I can't tell how much of this stuff with <potential conflict> is a literal statement of intent, vs some calculated diplomatic/political move that shouldn't be taken literally. How much more worried should I be?". The answer is usually something like "It's complicated and hard to say, but you can at least conclude that <signal they've extracted>, so I'm <more/less/about the same> worried". This usually doesn't improve my ability to extract signal very much, but it does help me calibrate my confidence in that ability. This is important, because it means I don't have to adopt a stance of complete epistemic helplessness.
I think your statement, "I think some people are able to figure out these rules and feel comfortable with them, and other people can't and end up as conspiracy theorists," is a bit condescending. Dr. Fauci's, as well as many other world-renowned experts', year-long denials that the virus could have come out of the lab are now being challenged by a large number of scientific experts, who are not now being labeled as conspiracists.
Can you quote some of these "denials"? If they're along the lines of his 2020-04-17 comment that "the mutations that it took to get to the point where it is now is totally consistent with a jump of a species from an animal to a human,"[1] well, that certainly isn't saying it's not possible that it came from a lab.
It sounds to me as if you missed this bit in Scott's article: "before you object that some different global-warming related claim is false, please consider whether the IPCC has said with certainty that it isn’t, or whether all climatologists have denounced the thing as false in so many words. If not, that’s my whole point."
Disney only acquired some of the Fox media assets. In short, Disney got the arts & entertainment, but an independent company still controlled by the Murdoch family kept the news and sports.
If we're being told what the government wants us to be told, then why do some media outlets report critically on the government?
For the millionth time, puts something nebulous into concrete, precise terms and fleshes it out. Kudos. Bonus points for talking about conspiracy theorists like they're human beings - far too much of the discourse on them assumes they're deranged and tries to figure out the psychology behind the derangement, without ever considering that news sources and experts *are* disingenuous and lazy a lot of the time and it's perfectly natural to notice that and overreact to it a little. Sure there are people who take things way too far and make a whole lifestyle out of it, but I find it a little disturbing how much discussions of MSM mistrust focus on the tinfoil hat crowd, to the exclusion of people who...just don't trust the media very much.
From all the replies I gather, first, that the author is pointing out something that can seem obvious to a small group of people but by no means to ‘savvy people’ in general, and second, that many good points are raised but a bad aftertaste remains. Ultimately it is poor practice to argue by dichotomy and spin the media bias in the author’s favored direction.
I'm not sure the stuff about 'glorious' vs. 'good' harvests is consistent with bounded distrust. Do you think there is a bound where the Russian government wouldn't say 'glorious' when there is a bad harvest, or that the govt. would say neither 'tax increases' nor 'revenue enhancements' but instead introduce 'progressive policies'?
It seems that bounded distrust makes sense in repeated games with monitoring, but the government and parties raise so much noise that monitoring the signal doesn't happen and bounded distrust doesn't work any more. Maybe it works for science and journalism.
"Ivermectin does not work" and "man made global warming is real and dangerous" are both important points of the Left dogma.
If I am on the Left and I use my Figuring-out-the-rules-of-the-game skills, and I exclusively end up approving bits that are part of the Left dogma, my Figuring-out-the-rules-of-the-game skills should be highly suspect.
I should therefore really, really be looking to approve a sufficient number of equally highly valued bits of the Right dogma. I would search for them, record them, and put them on my wall in picture frames.
I think it is a great overcomplication of two simple rules:
1. Is it profitable/convenient/useful for the perpetrator of the lies?
2. Can she easily get away with it?
If the intersection of these 2 factors is sufficiently good, the lie will be perpetrated.
I always use these criteria and am right in nearly 100% of cases. Of course, it comes mostly from the generalized Left. Right-wingers who try this don't live long - politically, professionally and physically. We still have some standards!
And, yes, of course, being in an airport and hearing Fox News on TV is as implausible as UFO stories. The only location where I saw Fox on TV in a public place was a certain kosher restaurant on 47th between 5th and 6th, and it happened probably over 6 years ago. Don't know if they survived COVID.
Scott's reflexive hatred of socialism shines through all half-hearted attempts at seeming objective. Sad, but I suppose it is inevitable based on the rightist slant in even our "moderate, centrist" discourse.
Awesome. This seems to omit discussing that our very standards of truth are often downstream internalizations of these external institutions' "truths". Most people's "truths" are composed of fractured, socially reinforced memes that have very little to do with reality testing, cohesion, or explanatory power. For these people, the heuristic of __don't/do trust the experts__ isn't as important as "does this vaguely jibe with my internalized metrics of when I believe something".
This seems important to mention because conspiracy theorists do have experts and authorities, they just code and signal their "truths" in a way that conforms better with the conspiracy theorists' internalized truth metrics. Maybe you were just going for the point of "selective sampling explains the complexity of some people's heuristics" in which case I would simply agree, and want to add that people generally avoid model complexity and prefer the buckets of "yes, no, and maybe".
By "politically correct" I mean politically correct for the sphere in which these people are operating. By "true" I mean they're sure enough about it to put their professional reputation on the line by endorsing it.
'That they're willing to sign this means that it's both politically correct and true'
'By "true" I mean they're sure enough about it to put their professional reputation on the line by endorsing it.'
I'm not sure there's a difference. If something is politically correct, that means it has taken on the attribute of being True in some layer of the simulacrum. Something being "true" in base-level physical reality has very little bearing on the layer above it, where Truth is decided by "consensus".
I find it highly unlikely that any observations from reality could possibly change the decided establishment position on any issue.
That is a very interesting paper indeed, thank you for the link. RNA just keeps getting more and more interesting, and hard to pin down.
What was posted there? About Ivermectin or mRNA?
It was a link to a very interesting Nature paper on long non-coding RNA, and some detailed work that suggested it had a *structural* as well as regulatory role to play in the epithelial-cell tight-junctions that keep capillaries from leaking. I did not keep a copy or link, but you can probably find it or related papers again by googling LASSIE (which is an acronym for this particular long non-coding RNA) and tight junctions.
Ivermectin bores me to tears, so not that of course.
I think the problem there is selectiveness. Ivermectin might have some vaguely plausible method of action, but so would a thousand other drugs. The question is why we are examining this drug in the first place, and if it's not for a very good reason, our prior probability of it being effective should be low.
I just read the "History" section of the town's wikipedia article. What in particular should be of interest?
Because of the weird Kellogg stuff, or why?
I think that at this point that's a general complaint of everybody against every news source, not specifically against Fox News.
On occasions when I've encountered a complete Fox News presentation, rather than a link to a single article (e.g. waiting rooms with media tuned to Fox News) I've been struck especially by the selection of news stories, compared to their competitors, and especially to the BBC.
The world according to Fox is much scarier and more dangerous than the world according to CNN et al., which is itself much nastier than the world according to the BBC, without dipping into anything clearly labelled as opinion. Fox News goes deeper fishing for bad events to report, especially violent ones. And the BBC plainly puts a lot of effort into finding and publishing positive news stories.
Sometimes specific media have other quirks. My local paper presents a lot of stories with victims outside of the white-cis-Christian demographic, and stresses the demographics of those victims. It also likes to feature members of the latest victim's demographic community bemoaning increased or persistent targeting of their demographic. While it's conceivable that the proportions presented accurately portray the proportions occurring, I doubt it, simply because the victim demographic is only prominently presented when non-white, non-Christian or similar.
“The world according to Fox is much scarier and more dangerous than the world according to CNN et al.”
Strongly depends on the topic. Listening to CNN every single American has died from COVID at least 3 times over. Well, at least those that weren’t killed by systemically racist gun violence first.
CNN has gotten a lot worse since Trump, and especially since COVID. Pre 2016, I’d probably agree with you.
My only exposure to Fox is through the clips that the YouTube algorithm pushes at me, and I have noticed exactly what you describe.
Furthermore, you can glean the respective political bias of both Fox and CNN just by reading the YouTube thumbnails…there is no need to actually watch the videos.
(Spoiler alert: the caption is always some variation of Our Guy - Good! / Your Guy - Bad!, with Your Guy - Bad! predominant.)
I suspect most of us who comment here on ACT have better than average BS detectors…I think I do, but have no idea how I developed it.
:prepares for comments section filled with people giving counterexamples for all the things Scott just said the media doesn't do:
I think part of the problem is that a sufficiently advanced ability to find people who are lying and be unreasonably credulous toward them is practically indistinguishable from lying - there are rules, but the rules have exploits so wide that one can reasonably call the game broken.
Haha, yes that’s what I was just doing in another post, running a cruise ship through those exploits. It almost calls for the humour of the original Charlie and the chocolate factory ‘tell me more about…’ head resting on fist meme, not to be mean, but because it fits and is always funny.
It is a game of inverted totalitarianism, with examples of all these behaviours there to fool you: sometimes they lie about things, sometimes they make stuff up entirely, sometimes they twist things so far from base reality it may as well be made up. The media do all sorts of lying, and some of it is wink-wink ‘business’ news that is just corporate counter-intelligence, or lies from the government where we pretend unemployment is under 5% when we have a workforce participation rate under 70%, etc. But there are other lies mixed in there, and some buried story with a handful of truth is the media’s continual counter-example, even if it is only 5% of their stories and never a front page headline.
Turns out they are talented liars and play against our well known strategies to spot the lies. Just like how con artists have an easy time stealing from over confident doctors. Knowing your mark’s weaknesses is essential and a smart con doesn’t mean you have to be smarter than your mark.
You thinking you know what’s going on…is part of their model of propaganda. We can sit and think to ourselves, advertising doesn’t work on me! I know it is a lie. Some of the broader propaganda model is the obvious wink wink nudge nudge game and that exists as a meta layer of lies to lull even the vigilant into complacency so they can slip in other lies through that loosely woven net.
They wouldn’t do that! Dan Rather is just such an upstanding guy and oh so square jawed! They wouldn’t make up stories with no basis in reality, getting actors is hard so they just insert footage of a totally different protest or riot. Which I’ve seen them do many many times with old riot footage from different cities and different countries even! How is doing that on purpose any different from paying actors to do it? Lazier, cheaper, and just as big a lie.
Was that teenage boy with a gun at that BLM protest turned riot an avowed racist doing racist things for racist reasons, as Biden and every left media outlet said pre-trial, as he gunned down innocent Black people?
Do the core events of an armed teenager ‘defending’ a business owned by non whites as he was attacked by a convicted white pedophile matter? When a random group of 3 adult men attack a boy by running him down, saying they want to kill him, smashing him with a skateboard, and trying to take his gun from him…and he shoots and kills 2 of them in the base reality….How far from reality can you get before it just doesn’t even matter what reality was?
surely you understand that by using the term "defending", you're unnecessarily biasing your supposedly objective retelling of the story
There are scare quotes around it; his stated purpose in being there was to defend the property of his community (he worked there) as well as provide first aid.
He was clearly there in an attempt to provide a positive presence, certainly more positive than those protesting who caused billions in damage.
Who cares what he believed his stated purpose to be? What are you, a Kantian?
He was an untrained guy with a gun at a riot. That’s what police are for, because they’re actually trained in riot management. We don’t want vigilantes roaming the streets during riots, precisely because of the effect it evokes from irrational riotous criminals like those he killed. He was a negative presence at the protest, and it shouldn’t matter that he attempted to provide a positive presence. It should have been obvious to him that what he was doing was reckless and unhelpful, but he certainly doesn’t seem like the sharpest knife in the drawer.
I wouldn’t be so snarky if their style of argumentation weren’t completely at odds with rationalism. I’m not saying one has to be a fully robotic utilitarian, but to focus entirely on intent means one is biased.
The situation can be analysed as a forever-ongoing signals-and-detection arms race between actors in informed and uninformed information positions.
...a general characteristic of much human interaction, really.
Why? The media company's goal is to reinforce their consumers' worldview by feeding them bullshit; the goal of the media consumer -- maybe not for you, and definitely not for Scott, but the average consumer who is being targeted -- is to find someone who will feed them bullshit that reinforces their worldview. I don't see where there's any conflict of interest that would lead to an arms race.
I guess in exceptional situations, like when threatened by a global pandemic, your average reader might become more interested in knowing the actual truth than in having their pre-existing beliefs reinforced. In those specific cases, the situation might temporarily become very arms race-like. But I don't see any reason for that to be true in the general case.
"The media company's goal is to reinforce their consumers' worldview by feeding them bullshit;"
Why would that be their goal? Is that one of their behaviours or effects instead perhaps?
It's certainly an instrumental goal at least, insofar as the terminal goal for the media company is to turn a profit. Whether it's a terminal goal is more debatable.
Curses! Foiled again. I was about to mention Sixty Minutes manufacturing false data. (Planting an explosive to make a certain car's fuel system look dangerous. )
That was Dateline NBC. Sixty Minutes got taken in by the fake memos, and kept on doubling down until management called in an outside investigator who said "wtf."
Yeah, if handled a bit incorrectly it'd result in over-prioritizing the prior. It seems like an unstable equilibrium (like free speech) while, unfortunately, "distrust everything" is a stable equilibrium (like full censorship), just as trapped priors taught us.
Man, every day I discover/someone points out some finely tuned heuristic I have running all the time.
I had the same thought. I kind of think of it as reading different kinds of graphs. At first, you might look at something and think it looks weird, then it's like "oh the scale is log10" or "x and y are counter to how I would have labeled them" or "the units are weird", etc, and once you know the lay of the land, you can decode the information and learn things.
This is a really good comparison. I'm reading the article & frowning & suddenly I'm like "oh wait the dancer is going the *other* way!"
A big part of the frustration of the moment is how much the rules of the game changed during the Trump presidency, at least for mainstream liberal media, and even for institutions. I thought I was reasonably familiar with the rules of the game, but things like the CDC statement on the BLM protests and the censorship of 'lab leak' theories caught me off guard – I would have previously said those weren't the kinds of lies to watch out for. It's been a difficult and frustrating adjustment.
This post, the attendant discussion, and others like it are valuable historical artifacts. Like the kids in the fairy tale dropping breadcrumbs, leaving a trail, these create markers. Later on we can read it and say “we were at that understanding in January 2022, and now it’s changed to (x).”
Nuclear/toxic exposure is a context where the “good harvest” approach has I think already been in effect. Even in more recent times with the military burn pit exposures - the agency saying there’s no problem, lined up versus thousands of sick people - in those contexts people assume the agency is lying, or a few individuals have prevented the agency from really looking, or even multiple careerist individuals have found it more beneficial not to look.
There’s an amazing research work called Wolves of Water. A guy in the UK was living with his family near a coastal site with some type of nuclear waste disposal in the water. His daughter developed cancer and he launched into a study of the situation and ended up correlating distance from the coast with cancer risk. During the Fukushima accident I wasn’t paying attention but I started noticing a few years later with the starfish die-off. I wound up on internet sites where people were posting their own atmospheric radiation data. A lot of it has been deleted now - the sites deleted - meaning, there went all that data.
The boundaries of what it’s acceptable to say about science have some less obvious frontiers. That’s one of them, the whole climate modification situation is another.
Something about COVID, it combined the “tell lies about toxic exposure” tendency within government, with a groundswell of millions of people needing to know the truth. Airborne things were usually radiation before, which “blows away” or dissipates or creates cancer rate spikes three years later when it’s almost impossible to really connect. Plausible deniability was baked in.
The conspiracy theorist gray area around toxic exposure is structured differently from what the post describes, I think. When it’s ideas, yes, it makes sense. When people are amassing competing medical data it gets harder to call it misunderstanding - even when there are plenty of places where reasonable people can disagree.
Epidemiologists found themselves saying things previously reserved for the nuclear folks via Covid, that’s one of the changes, I think.
You mean the “bald faced” kind?
The Trump presidency definitely seems to have created an inflection point in the truth/BS data stream.
I am far from a Trump fan, but I found it fascinating how the press stopped even pretending to be objective once Trump rolled around.
This exactly. The media very much, and frankly very openly, tore up the rule book once Trump was elected. They tried to justify it by saying that essentially Trump broke the rules first, in a way that exploited the rules and made it impossible to report on him in a normal way. And I’m actually sympathetic to this! Trump DOES lie in a different, more bald faced sort of way than the average politician.
But the journos did not limit themselves to Trump’s bald faced lies, or even to Trump himself - now their favorite phrase is “said, without evidence” for whenever a GOPer says something they disagree with.
I don't know if I'm sympathetic or not... It seems they (media institutions) were faced with a crisis, and on the whole chose to meet it with the power of the Dark Side.
Were they faced with a crisis?
Bad Politics Man Made It Into Office will happen occasionally, regardless of your political affiliation.
The fact that the media treated a *gasp* not-elite *gasp* conservative being elected as a DEFCON 1 event *is* the kind of institutional bias that conservative "conspiracy theorists" are going on about constantly. I think they are wrong on the factual matters, but when *the entire elite establishment* including journalists, researchers, etc have made it transparently obvious that they're in the tank for "whoever isn't Trump" - if you're a Trump supporter, you have been given no actual reason to believe that they're being honest when they said "yeah all these abnormalities in election reporting are normal and happen every year".
They are experts, and as such can craft expert lies rather than normal lies.
This is a problem, because the message being sent by anyone with any sort of professional expert credential during the 2016 to 2020 timeframe was "Trump is the worst thing that has ever happened in the history of the United States and must be stopped at any cost". Trump supporters, in general - responded with "message received, you will stop at nothing to keep Trump and his politics out".
Very nicely put. I feel this way, too.
Yes, I totally agree. I came here to ask for advice on how to adjust better to these (now more common) types of lies. How could we have known that "masks don't work" meant "make sure masks are available for medical personnel"? What's the lesson to make ourselves better at finding the signal in the future?
I don't think this holds up to a close read of history. Media on both sides have covered many issues very poorly, either because they lied or just weren't knowledgeable enough on the subject to understand the truth. It's also a lot easier now to find criticism of all sides of the media, as well as easier for topic experts to weigh in directly on topics that they know first hand.
Right - this is what a lot of liberals have missed. I went on a spiritual journey in Asia and tuned out the news from basically right after Trump got elected until Covid arrived. It took me a bit to figure out what was going on, but it ultimately became clear that the rules of the game radically changed during my three year absence. Lab leak and BLM narratives are good examples. We went from Chomsky's "Manufacturing Consent" paradigm, which is the paradigm Scott seems to still be working off here, to a new "Manufacturing Reality" paradigm, in which all bets are off. Yes, it is still possible to glean useful information from sources like mainstream academic journals and the New York Times. But one's level of skepticism has to be taken to another level, particularly in areas where there is a clear and established narrative. In those areas, you should expect to be, at the very least, misled. And, under the new rules, you may well be outright lied to.
That was what took me from Scott's camp to the "there's no lower bound" camp. There were many other examples - including some things that come very close to what Scott pointed out the press wouldn't do, like once when they showed a foreign hospital with a lot of sick people to imply it was happening right now in the US, and another time when they showed a firing range in Kentucky and claimed it was footage of an air raid in Syria (somehow they still have to switch countries - I guess there are *some* rules?).
Even more recently, I've read a story about Supreme Court justices where journalists claimed they said and did something, and all the participants came out and plainly said "we never said and did that!", and the journalists were still "well, we still think you did, and you're lying to us because reasons".
Oh yes, and who doesn't know what gave birth to the "Let's go, Brandon" meme?
But for me the trigger moment where I arrived at the realization that there are no rules anymore - or at least there are no rules that I thought there were and none I could figure out. They will say literally anything or do literally anything if they think it'll serve whatever purpose they have. And the number of people among "experts" who are willing to stand up to this is extremely small. The number of people knowing the lies are lies still large, but most of them either don't have voice, or don't want the Eye of Sauron to focus on them for raising it.
The world became much scarier that day. And it's still pretty scary.
Yeah, right now we have basically two kinds of media. One is nicely controlled, polished, has a narrative, and will say anything to drive the agenda they are currently driving; for it, you being informed is actually a negative - they want you to arrive at, and be secure in, an opinion they prescribe, and that's their only goal. They would gladly lie, suppress or distort information if they think it serves their goals, and they feel zero loyalty to their consumers - whom they see as raw material, not clients.
The other one is actually doing the function that the media is supposed to do - disseminate information about current events - but they have virtually no quality controls and only very weak and rudimentary reputation mechanisms, so the quality of the information varies wildly. They would never suppress or distort what they think is the truth, but what they say could be true, or it could be a figment of somebody's wild imagination.
Somehow one has to maintain sanity and be a responsible and informed citizen in this environment. It's not easy and it's not going to get easier anytime soon.
It's the Party of Evil and the Party of Stupid, although they might've switched valences.
https://www.goodreads.com/quotes/803704-we-have-two-parties-here-and-only-two-one-is
"But for me the trigger moment where I arrived at the realization that there are no rules anymore - or at least there are no rules that I thought there were and none I could figure out."
Sorry, what was the trigger moment? Apologies if I'm just selectively blind here, but you seem to have left the actual moment out.
The moment was described in the parent comment - when the "health experts" came out and said BLM protests are good for public health.
Eh... It was more like "if BLM protests lead to more socioeconomic equity for Black people, then they are a net positive for public health". It was obvious BS at the time, but it wasn't an outright lie.
In other words, it asserted that people dying was an acceptable trade-off for people of a certain heredity having it better, while people of differing heredity were being told that their living normally was verboten because more people would die than if it wasn't.
That's not 'BS' though. Whoever has the power to define "public health" can assert such things; whoever has the power to implement policies under the rubric of "public health" has the authority to carry them out.
"Public health" - it's a funny old bird. A little thought will reveal that no gestalt emerges when a great many people's individual healths are assembled. Some people will be poorly, others hale and hearty, and all are wholly unaffected by whatever someone else's summing up of the delusive 'overall picture' might be, until such time as it begins to inform "public policy".
Is mild malnutrition for all an instance of better or worse "public health" than some being well-nourished and some others severely malnourished? Must a doctor be deputised by the public to pronounce on the matter? Do they get to ask for a second opinion?
There is thus no such 'thing' as "public health" in the sense of the notion of something good and objective which the words conjure up, but it is a powerful conjuration, and we have likely not yet seen the greatest works to be done in association with the incantation's utterance.
Right, but Trump himself is no innocent man in this matter. He has a skill in being able to shamelessly tell any lie with a straight face.
So many replies, and not a single one mentions that there was no "CDC statement on the BLM protests." A simple Google search reaffirms that I remembered this correctly. In fact, the *only thing* the CDC director said about the protests is that they probably spread COVID, and everyone involved should get tested. That was the director, speaking before Congress; the CDC never made any official statement as an organization.
yeah, distrust of commenters isn't bounded for me.
Ha ha
I’m sure he was referring to the open letter signed by a bunch of self-professed public health experts, which included at least one who claimed to work at the CDC. There were also some tweets from a former CDC head to similar effect. You’re correct that the above commenter has the facts wrong. To the extent that the open letter was co-signed by actual public health experts, which is how it was presented to the public, it made those experts look pretty bad.
This sounds like an isolated demand for rigor. You are picking on a small inaccuracy with words to deflect from the main point which you aren't honestly arguing against.
Yes, it's inaccurate to call it a CDC statement. But there was a letter co-signed by "public health experts". Now maybe those weren't real experts, or the mainstream consensus was against them? If that's your argument, then make it explicitly.
As I remember it, all sides took the letter at face value as an expression of what mainstream epidemiologists wanted to express publicly. Maybe many disagreed quietly, but I don't recall prominent officials or institutions coming out and saying anything.
I don't think it's a small inaccuracy. If the CDC made that statement, we'd be living in a very different universe.
What actually happened was very much in line with Scott's post. A lot of individual experts beclowned themselves. The media was a bit over-eager to report on this. For political reasons, a lot of people who knew better remained quiet.
But, for example, look at how CNN reported on it: https://www.cnn.com/2020/06/05/health/health-care-open-letter-protests-coronavirus-trnd/index.html
You can argue that they should have provided alternative views, and the failure to do so indicates bias on CNN's part. I would agree with this. However:
1) At no point is this explicitly stated to be a consensus view. In fact, the letter itself--as quoted in the third paragraph of the article--claims to have been created "in response to emerging narratives that seemed to malign demonstrations as risky for the public health". An astute reader, reading between the lines, would take this as an admission that the letter does not express a consensus view.
2) The article provides actual numbers. 1,200 sounds like a lot, but is really just a tiny portion of the millions of health experts in the US. Many of these 1,200 come from a single university--which is noted early on in the article.
3) The very first words of the article are "A group of health and medical colleagues..." A reader with bounded distrust would notice that no major organization gave their blessing to this letter--only a bunch of individual people.
4) The letter itself doesn't actually contain any lies. Just terrible opinions. (In fact, the content of the letter is even worse than I remembered. I would not trust any individual doctor who signed it. Luckily, I'm unlikely to ever interact with these 1,200 people, who mostly live in a different part of the country.)
The article doesn't make this clear, but many signatories were not really health experts--some were even students. Yes, the media should have reported this. And, yes, more people should have spoken out. But these are all dynamics Scott mentions in the article. They're all well within "bounded distrust."
Whereas the person I responded to claimed that the rules had completely changed. Which might be true, if the CDC made this statement. But that's not what happened.
I actually agree that institutions generally became less trustworthy over the last 5 years. But I don't think we're in some totally new paradigm.
Thanks for laying this out. "There is no lower bound" type comments are admitting ignorance - which is fine - but some are trying to dress it up as new knowledge.
Thanks for this. It does fit within what Scott's point was.
Yeah, post-2020 I found mainstream media is capable of lying much more than I used to think.
Based on the first section, this fatally fails to distinguish between Fox commentary (i.e., Tucker Carlson, Hannity) and Fox NEWS. They are different. Indeed, that's how Tucker beat one lawsuit, by arguing that no reasonable person would see what he does as news.
Rachel Maddow used the same reasoning in a lawsuit against her.
Yeah it's a cake and eat it too situation. And even though Fox kinda started that model of "news opinion as opinion news", the success of their business guaranteed it would go on to infect virtually everything.
I think it's just an "eating the cake" situation. I mean, if you come to a place called "cake eatery", you expect people to eat cake there. And if you come to a place called "Tucker Carlson Tonight", you'd expect that tonight there would be a guy named Tucker Carlson telling you how he sees things. I think it's pretty hard to get more "what it says on the can" than that?
If you make specific false (and defamatory/libellous) factual claims as part of an argument, the fact that other parts of the argument were opinion does not (in my view) make the lies OK. Nor does the fact that a careful observer could figure out that you're a habitual liar, not if your core audience believes the factual claims you're making are true.
I'm not sure what your point is here. Were you trying to impress on me that lying is not OK? I know, but why did you feel the need to explain it to me? I certainly didn't say anything that might suggest otherwise.
What everybody believes is their personal business, and I am not sure I can make any claims about Carlson's audience beliefs specifically, except to note that it's highly unlikely they'd be in the audience for long if they thought what he says is usually false. Of course, if you do think it is the case, you can always withdraw yourself from that audience.
The point of my comment was, however, that it is strange to imply Carlson is pretending to do something he is not doing when he's running an opinion show specifically marked with his personal brand. What value you attribute to that brand is entirely up to you - but it is what it is, the opinion of a guy named Tucker Carlson, no more, no less. It doesn't make him more right or wrong than anybody else, it just makes claims that he's pretending to do something he's not doing unjustified. If he claims X is a fact and turns out X is not - he's still wrong. That can happen to the best of us - for example, we just witnessed several Supreme Court members openly proclaim wildly fictitious statements directly relevant both to the facts and the law of the case they were deciding. Sad, but that's the world we're living in. At least with Tucker you know what you're getting, and if you don't like it - you can easily stop getting it.
>for example, we just witnessed several Supreme Court members openly proclaim wildly fictitious statements directly relevant both to the facts and the law of the case they were deciding.
To what do you refer?
See also: Alex Jones.
See also the Jon Stewart “clown nose on / clown nose off” behavior.
See Vox, for that matter. Somewhere in the SSCsphere is a passage complaining about how Vox will hide commentary among its "voxsplainers", where it's hard to notice unless you're paying special attention.
Except the people who accuse fox news of bias are talking about the actual news too. Everybody knows Carlson is biased, he's an opinion giver. Nobody thinks he "unbiased" in the way news ought to be. They claim that the news presented by Fox news is itself unreliable.
What I thought about the case where Fox is showing the police news conference with a suspect named Abdullah is that I would worry that Fox is showing a news conference with a suspect for an unrelated crime. They won’t make up a news conference, but they will show an unrelated one as if it’s related until authorities explicitly say it isn’t.
Leaving aside multiple cases where media of all stripes have used bad art to illustrate current news, do you have examples of where Fox has presented a conference as addressing one issue when it was about a different one?
The first example that comes to mind is the series of articles about Hillary Clinton’s emails, all presented as if they contained new information, all of which turned out to be about the same emails that had been discussed for months. I think a lot of the Trump Russia stuff was like that too - something about some Russian activity is juxtaposed with some Trump statement to make it look like they were connected, even though there is no specific allegation of connection they are making.
Regurgitating a slightly repackaged old piece of news to keep it in the news cycle is a different thing than showing a news conference for one event and claiming it’s another.
Yeah, I wasn't imagining them specifically *claiming* it's another - I was imagining a situation where there's five suspects and the police holding a conference about all of them but they just show the one, or a situation where it's unclear whether the conference is about a suspect in this case or some other case and them juxtaposing it with this case in a way that makes it seem relevant, or any of a million other similar things.
> Except the people who accuse fox news of bias are talking about the actual news too.
If I had a nickel for every time someone followed up "Paper XYZ said [outrageous thing]" with a link to something clearly marked as an opinion piece, I'd be a wealthy man. There are claims to be made of bias in actual reporting, but an awful lot of the volume is clearly coming from folks who fail to make the distinction.
That's true, but it's kind of a subcategory of "knowing the rules."
> Except the people who accuse fox news of bias are talking about the actual news too.
I feel like this isn't obviously true! I mean, of course there's people who think all sorts of things, but I'm very often hearing people distinguish between the news side and the opinion side.
Maybe I'm just in a bubble, but maybe you are!
One problem, which Fox and MSNBC and CNN have created for themselves, is muddying the distinction between their news and opinion sides. They will have Hannity or Maddow anchor election coverage and give news updates, while also opining wildly, making it harder to parse for a person without a background in news. I don’t think Fox was the first mover on this front, but they are really bad about it, so I understand why people find it so easy to denounce them with blanket statements.
I could be wrong, but I don't think MSNBC has a news side, it's all opinion. Or rather, NBC's news side is just "NBC News", without the "MSNBC" branding.
It’s fuzzy, but there is spillover between MSNBC and NBC, with Lester Holt working at MSNBC first and now anchoring NBC nightly news. NBC News has a lot of arms, and MSNBC is the most clearly editorial, but it’s also marketed as a news network.
This may not be the current status but I hinge my opinion on Fox News coverage on a study rating the ratio of positive to negative news articles about McCain and Obama in their presidential race
CNN et al were in the area of 80% positive Obama stories and 20% positive McCain while Fox had a narrow spread, about 10 points. It was something like 55% positive McCain and 45% positive Obama stories
Much of the 'fake news; THEY are lying' craziness occurs because somehow, we have all apparently abandoned any difference between facts and opinions. "A happened" (such as the mass shooting in the Fox example) is an observable, testable fact. "Therefore B should happen/become law/be done" is an *opinion* that we might agree with or disagree with.
Your point about the difference between Fox NEWS and commentary is really IMHO a broader point to much of news, authority, or social discourse.
"I believe that in some sense, the academic establishment will work to cover up facts that go against their political leanings. But the experts in the field won't lie directly. They don't go on TV and say "The science has spoken, and there is strong evidence that immigrants in Sweden don't commit more violent crime than natives"."
A possible exception to this rule: https://twitter.com/Telegraph/status/1481176891998490624
Emails from the start of the pandemic show that some of the leading scientists working on emerging viral diseases thought that a lab leak was reasonably likely but then they signed a letter in the Lancet saying the exact opposite (https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(20)30418-9/fulltext).
I came to say this exact same thing. I feel Scott is taking the wrong lesson from the current media landscape. He has assumed the old rules still hold sway. But the rules are changing under his feet and his confidence is sorely misplaced.
That letter condemns conspiracy theories and very carefully avoids saying that the virus could not have found its way to humans through research activity. I think it's a pretty good example of the kind of not-quite-lying that experts do all the time.
That letter should be read as political speech, not science.
My point isn't that it's not bad. My point is that you should disbelieve letters like that.
[edited to add]: To be clear, I agree it's really bad for a lot of reasons. The fact that it is political speech is itself bad. I just don't think it's a counterexample to Scott's argument.
I think I agree, but I think there's also an overton window on science communication, and stuff like the letter(tm) pushes everything in an ungood direction. I keep waiting for someone to finally come out and say "Look, it was a crisis moment, there was plenty of reasonable doubt in both directions, and we couldn't afford to have talking heads on the 5 o'clock news impugning the country that not only had the most data on the virus, but also makes a substantial chunk of our meds and ppe based on scant speculation" That to me is a perfectly reasonable defense, if someone would have the guts to say it.
To elaborate a little bit more:
I think one correct takeaway from the letter is that there probably is (or was at the time) some real genetic evidence that rules out or pushes back strongly against at least some varieties of non-natural origin. Another takeaway is that the question of the virus's origin is a politically-charged topic, and that the scientific community is probably going to be pretty biased in how they approach it. When a letter talks about "standing in solidarity" and "fighting disinformation" and only one sentence out of 18 makes a scientific claim, you should assume that it is mostly unreliable.
The elephant in the room is the censorship practiced by the big social media platforms, which spread to a lot of the "blogosphere". So opinions and evidence that ran counter to the guidance from the CDC and WHO were suppressed as "misinformation". The party line was / is that Hydroxychloroquine and Ivermectin were not only ineffective, but HCQ was dangerous due to a chance of heart issues. Yet there is a study from 2005 published on PubMed (an NIH website) that concludes that HCQ is effective against SARS viruses when given early and is well tolerated. The approach of Uttar Pradesh, where teams actively sought positive Covid cases and provided a kit that included Ivermectin and other palliatives, appears to have been quite successful.
The biggest red flag for me, besides the suppression of dialog, is the interference with the doctor / patient relationship. HCQ and Ivermectin, for example, are widely used with little adverse reaction. From the beginning there has been anecdotal evidence from doctors that patients who take HCQ for their autoimmune problems have handled Covid quite well. There is no evidence I've seen that HCQ or Ivermectin cause problems for Covid patients. So what is the justification for the reported suspension of doctors for prescribing them off label?
Which leads to the third major problem - the sloppy statistics with poorly documented rules for collection and the lack of granularity. Just today I saw where the stats from Hamburg were grossly wrong in asserting that the majority of recent cases were from the unvaccinated. The handling of Covid 19 by the medical and political establishment has been a hot mess.
> some real genetic evidence that rules out or pushes back strongly against at least some varieties of non-natural origin
That is much more narrow than what was claimed, and a team of scientists writing in a scientific journal know the difference.
> When a letter talks about "standing in solidarity" and "fighting disinformation" and only one sentence out of 18 makes a scientific claim, you should assume that it is mostly unreliable.
This is a strange way of reading that Lancet letter.
- It's in a scientific publication
- Written by scientists
- The only scientific claim made is that "this coronavirus originated in wildlife"
- They pepper that claim with a bunch of citations and erroneously state that those papers "overwhelmingly" support their one scientific claim
- The result of the letter is to push discussion of a legitimate line of scientific inquiry out of serious consideration by both scientists and the general public
Yes, there are the political statements in there. But there was a concrete scientific claim that appeared to non-experts that it was well-supported in the scientific literature. That lie was designed to be persuasive precisely BECAUSE it was made as scientific claim, not because it was an appeal to authority by scientists. That distinction crosses all the lines Scott lays out above.
But when scientists use their authority to peddle politics, they hurt their science a lot more than they help their politics.
But it was published by scientists, speaking as scientists, in a scientific journal.
The same letter had the chutzpah to claim "We declare no competing interests."
I get pretty angry when someone does something that obviously serves no purpose except to deceive or confuse, and then it gets defended on the grounds that "everyone" sees through it so it's not really a lie.
Like, there's a price tag that says $9.99, and then someone tries to pay $9 for it, and the clerk scornfully explains that it costs $10, what's wrong with you? The store is engaged in deliberately-engineered psychological warfare to confuse their customers! It's not *especially* effective psychological warfare; most people manage to figure out the actual price (eventually); but it's effective enough that stores are measurably making money from it. Getting angry when a customer is confused by the thing that YOU did with THE EXPLICIT GOAL OF CONFUSING THEM is like beating someone and then complaining that they bled on you.
If the REASON you refer to your tax increases as "revenue enhancements" is that it makes people get less upset about them, then it is obviously a lie, and it obviously matters--otherwise it wouldn't work! Claiming that it's not a lie, or that no one is fooled, or that no one who matters is fooled, is just an attempt to escape responsibility for telling lies.
There are situations where it's legitimately OK to say untrue things because you aren't INTENDING to fool anyone--jokes, sarcasm, fiction, etc. But if the whole point is to profit by impeding your audience's understanding, then this defense is not even slightly available to you.
True, very good point on the 9.99 thing. I never get caught by those cheap tricks; I convert them so automatically it does not even bother me. But my GF is often caught, and you just made me realise I should not be mildly annoyed at her for that, but mildly annoyed at the supermarkets...
I think the primary harm is not from people forming inaccurate conscious beliefs about the price, but people having a split-second gut reaction to the first digit before they've even finished reading the whole number.
I suspect a lot of people who believe they are "never caught" by this are nonetheless being influenced to be statistically more likely to buy the thing compared to a counterfactual where the price tag said $10.
But I believe the subconscious nudging is on a continuum with the people who actually try to hand the cashier $9. The $9 people are just the ones where the trick worked better than the store would have preferred.
Even if it truly doesn't work on you, I think you should be upset that they tried. When manipulations work, you often don't notice; if you catch someone in a failed attempt, you should punish them as a deterrent against trying. (Compare: punishing an attempted pick-pocket who didn't manage to get your wallet.)
Unfortunately, many casually-manipulative business practices are so common in our culture that you can't find a competitor who doesn't do them. I don't feel our culture is sufficiently upset about this.
I don't recall seeing any "$X.99" prices in the (Australian) supermarkets for a couple of years. Not sure exactly why they stopped, but they seemingly did.
There's price discrimination via hide-the-cheap-brand, though.
You can also consider it as part of the social lubricant that allows a society of individuals with an enormous range of personal interests to coexist peacefully. In a large number of social transactions, probably most of them, the value received is not exactly balanced -- could hardly *be* exactly balanced. A certain amount of genteel obfuscation allows the relative gainer to appear gracious and the relative loser to save some face.
Id est, to take your simple example, when I buy a gallon of gas the cost of the gas is in the present but the value for me is in the future, so I'm a little grumbly about the transaction. Jesus! $70 to fill the tank! Grrr. Putting the price at $4.99 a gallon allows me a tiny bit of psychological self-delusion that I'm paying about $4/gallon instead of about $5/gallon. I *know* the truth if I think about it even for a second, and I've reconciled myself to its necessity, but the "5" is not staring me in the face the whole time I'm at the gas pump, so it's less annoying. The oil company is thus doing itself and me a slight favor by obfuscating the true price very slightly, so that it's less in my face and the transaction takes place with less annoyance.
We do this all through language. It's why the caring physician speaks of your mother's "passing" instead of using the brutal non-euphemistic word "death." It's why the teacher says you "aren't getting a passing grade" instead of the more brutal "you're failing." It's why that girl said "I think we should see other people" instead of "you're boring and I would rather cut my throat than commit to you." These things can be looked upon as "lies" because they are not unvarnished truth, but they also allow us psychological space to accommodate ourselves to some harsh realities. They are a very necessary part of how a species like ours gets along without incessant fighting over small (but painful) gains and losses.
True. Maybe worth mentioning that overdoing it turns into sarcasm, which is worse than brutal truth: it's rubbing in your face that you are not only the loser in this transaction, but that the winner does not fear you even a little and makes fun of your helplessness. You are not just the loser of the transaction, you are a loser.
Like the health minister in my country. I always thought that he had a half-smile when he announced the new restrictions, contact limitations or lockdowns, punctuated by "I know it's hard" or equivalent....
Sure, people can be assholes, and power corrupts. I'm just pointing out Chesterton's fence here. Ambiguity in language exists not because generations of humans are too stupid to think up precise terms, or because they are always trying to con each other, but also because we use those ambiguities to help ease social tensions that would otherwise have us at each other's throats more than we are.
I don't disagree with you really, but I want to point out that the clerk isn't the party who is carrying out psychological warfare. They probably didn't even place the price sticker. I think they're justifiably irritated in this case. They're not trying to trick anyone.
If the corporate overlords showed up irritated at the customer, it'd be a different story.
Clerks get berated for a lot of stuff that is not their fault.
It's highly likely that a given supermarket worker has placed at least a few price tags; my understanding is that restocking shelves vs. cashier are more of an as-needed substitution than different job titles.
More generally, you're getting at one of the basic functions of bureaucracy i.e. to conceal the guilty party both in the physical and informational sense from the aggrieved party. "Throw your hands up in defeat" is one response to that, yes, but it's obviously not perfect given that it's literally letting the bad guys win.
Right, I wasn't advocating throwing your hands up in defeat, I was advocating getting mad at the appropriate party.
I expect people to describe things in the way most beneficial to them, and trying to move away from this local maximum is near impossible.
Well what else? If as an organism I am not 100% all the time maximizing my personal welfare (or at least that of my genetically or memetically related tribe) then my DNA is nonoptimal and my germ line will be replaced by another that isn't. Or rather, it would already have been replaced a million years ago, so you're only going to find an individual *not* acting that way if they are some weird sport mutant.
Right, you'd need everyone to be completely fair and neutral and then someone else enforcing that.
I apply one filter to politicians and advertisers, where it's generally "I can't prove them wrong in a court of law."
I don't have much choice in politicians and advertisers, but I can choose my news media I listen to. If I get the wrong impression (after applying filter), they're wrong.
I don't think that this reductionist view that all our societal interactions are this strictly determined by genetics is correct. Genetics are certainly a large influence on behaviour, but a) genetics do not get tuned to maximise reproductive success but merely avoid too-serious reproductive failure, and b) our primary means of maintaining social structures is environmental, not genetic. Most humans are certainly capable and often do take short- and long-term actions that are not genetically optimal.
Agree. Good points. But instead of getting angry, I want to do as Scott suggests and figure out how to get better at finding the signal.
Is this price thing a metaphor or have you known of people giving $9 to the cashier?
I haven't personally known anyone to do this. I've read some allegedly-true stories mocking customers who made errors due to common deliberately-confusing sales tactics that worked "too well", but I don't recall which tactic(s) specifically, so I picked an example on the basis of how succinctly it could be explained.
The distinction is between "deception" and "lying".
"Everyone" agrees that "$9.99" is deceptive and that it attempts to gain money in a zero-sum fashion (i.e. extract it from the customer).
Most people, including me and seemingly Scott, agree that it's scummy behaviour (most of the rest are scum).
Where people disagree with you is on the use of the *actual word* "lie". It seems useful to be able to distinguish various forms of deception from each other, and "made literally-false statements" is a category most people feel should have its own word. The apparent consensus for the word to use for that category is "lie".
In that specific sense of "lie", it is not a lie. The price says "9.99", and you can pay $9.99 at the checkout. It is literally true.
I'm not defending it or anything; it's bad. It's just a different sort of bad than "lying".
I think the comment I replied to above is pretty clearly using "lie" to mean something other than "make literally false statements".
Also, "lie" is not a term of art with a precise technical meaning, but I don't think your definition matches common usage, either.
If you say something you believe to be true, but you turn out to be wrong, I don't think most people would call that a "lie".
I don't think most people consider metaphors or idiomatic phrases to be "lies", even though they usually aren't literally true.
Conversely, in my life I've heard many phrases like "lie by omission", "lie with your actions", "the truth is the best lie", etc., implying that "lying" does not require literal falsehood.
My overall impression is that, in common usage, "lie" means something much closer to "attempt to mislead" than it does to "make literally false statements."
Setting the semantics aside: It's my impression that people have strong instincts about sticking to the literal truth when they are SPEAKING (in a context where they might be accused of lying), but LISTENERS have no such instincts and basically only care about intent. I suspect the literal truth standard was evolved as a defense against accusations, not as a behavioral norm.
In the specific case trebuchet went a bit too far, but your reply was extremely general.
PolitiFact generally makes the distinction and awards something like "Half True" to "literally true but misleading".
There's also the information-theoretic way of looking at things: you can actually rule out a lot of possible worlds from "tells the truth misleadingly" if you are sufficiently careful, but information from a known liar doesn't rule out any.
I've not heard the latter two phrases you cite.
Believing that Covid was caused by a lab leak _is_ a conspiracy theory. Thinking that there is some, entirely debatable percent chance that it was caused by a lab leak is not.
I have yet to hear a single person say that there is 0% chance that Covid was caused by a lab leak. Yet, over and over again, I hear the type of people Scott describes in the article above insisting that it was "100% a lab leak."
That's because human thinking (and language, downstream from that) isn't well suited to explicitly deal with probabilities. Consider the very words "true" and "false" which naively seem to imply either 100% or 0% probability for some proposition, which is clearly unrealistic in most cases these words are deployed. What generally happens is that the most likely seeming hypothesis gets to be called "true", and everything else is "false" with increasing degrees of indignation/ridicule. When the previously "true" thing happens to lose its provisional status, this tends to generate much cognitive dissonance in the epistemologically unsophisticated.
In most contexts, believing means true with a large probability. >75%? >90%? Not sure, and it's not often important...
The context where believing in fact means being sure (100% probability) is religion, and, as a likely extension, markers sent to outgroups. I am pretty sure that when you are in the lab leak group, you will hear people discuss the (high, very high, almost sure but not 100%) lab leak probability, while it's those stupid sheep that insist that it's a 100% natural zoonosis that jumped to humans without any lab being involved, because it would be racist to say otherwise...
It is unclear to me what you are saying here. I think you are saying that you do see people on the non-lab leak side who express supreme confidence. I am sure such people exist. Possibly, people are just more likely to "dig in" when they feel like they are on the losing side of an intellectual debate.
As far as belief being some percent (>75%? >90%?), that wasn't really what I meant. I find lots of people say they are 90% sure, but no amount of contrary information will make them budge from that. Effectively they are 100% certain.
I say that, reading you, I suspect that you are on the lab_has_nothing_to_do_with_it team, and that's why you think the out_of_lab opinion seems to be a largely monolithic bloc of believers closed to discussion, while people thinking that the natural zoonosis is more likely are more nuanced. People from the out_of_lab team will just have the reversed opinion: it's the people insisting on no lab involvement that are monolithic fanatics who refuse to fairly consider contrarian information, while in their camp new information is processed and their belief in a lab leak is updated to take account of new info. Nobody vocal will ever go from a >50% to a <50% opinion, of course, and the minute adjustments will not be communicated to the other camp, because they would be misused to weaken their opponents in the eyes of the non-vocal bystanders.
Bullshit.
"We stand together to strongly condemn conspiracy theories suggesting that COVID-19 does not have a natural origin." is pretty much saying exactly that.
https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(20)30418-9/fulltext
Didn't read the article. Not referring to its contents. What exactly is it that I said that you consider BS?
I'm confused about how you're using the term "conspiracy theory" here. Both cases you describe seem like conspiracy theories. In case 1 (Believing that Covid was caused by a lab leak), the claim is that a conspiracy definitely happened. In case 2 (Thinking that there is some, entirely debatable percent chance that it was caused by a lab leak), the claim is that a conspiracy might have happened.
Either way, a conspiracy is involved. That makes them both conspiracy theories.
It seems like you're just using "conspiracy theory" as a synonym for "something that is false", as it has come to be used by a certain crowd recently. But that is ridiculous. Conspiring is one of the most basic, fundamental human behaviors.
How is case 1 (Believing that COVID was caused by a lab leak) a conspiracy theory? The claim is that an *accident* happened.
And, yeah, that the people responsible went full Bart Simpson, https://youtu.be/WTbgsoHDc24 , but that's hardly a "conspiracy theory".
The gain-of-function research, the covering it up and lying about it
Only in a sense where any statement about a concerted action by a group of people is a "conspiracy theory", since if they acted together, it's a "conspiracy", and if you think about it, it's a "theory".
But that's not what is usually meant by "conspiracy theory" and definitely not what was represented as "conspiracy theory" with regard to lab leak hypothesis. What was represented is that it's a near certainty that it wasn't a lab leak, and we only say "near" out of scientific politeness because for all practical purposes it is as certain as any other fact we know about our reality, there was never plausible reason to think otherwise, everybody who supports the lab leak idea are freaks and fringe operators, there was never any serious science behind it and any idea that it could be plausible or should be taken as a serious scientific statement is preposterous, and anybody who brings it up should be laughed out of discussion immediately. They did not say "0%" explicitly, maybe, but they came within a Planck distance to it and put a huge billboard there saying "The Truth is here!". So I don't think the difference matters.
I see what you have done here. Us laughing at the absurdity of your conspiracy theory and paying it no heed _is_, in itself, a conspiracy theory. Of course, your conspiracy theory actually involves a conspiracy to hide and cover up this alleged event, abetted by all of the scientists saying, "It certainly looks like it could be from natural sources." While our conspiracy involves us rolling our eyes and not following you down a rabbit hole. [edit: \s]
See how funny it works: you start with "nobody claimed there's 0% chance", and then you proceed to mock supporting the idea that there's a non-0% chance and discuss the "absurdity of my conspiracy theory". Mind you, you don't even bother to establish that it's a "conspiracy theory" or why it's "absurd"; you imply it is a proven and foregone conclusion, and the only thing left for you is to mention this as an obvious fact. And you imply your laughing at it, and rolling your eyes at it, is the only natural response, and call even addressing any of the concerns, even bothering to substantiate anything, "following you down a rabbit hole".
And with all that, you still never said the words "there's 0% chance" - so in fact your original claim is technically still correct. That's a masterful work.
"We stand together to strongly condemn conspiracy theories suggesting that COVID-19 does not have a natural origin."
I guess you could argue about the syntax here. But I interpret this as meaning: "all theories that suggest COVID-19 does not have a natural origin (eg lab leak) are conspiracy theories and we condemn them".
One could also argue that collecting viruses from bats in some cave, growing them in the lab and then accidentally infecting someone is still a "natural origin".
Yeah, there's no careful reading of that letter that makes it not an outright lie. There's no way to say that scientists “overwhelmingly conclude that this coronavirus originated in wildlife,” when there was none of the direct evidence then or now that we'd seen from SARS1 or from MERS to support that conclusion. This letter was a blatant use of manufactured 'consensus' to spread outright lies and subvert additional scientific scrutiny.
This was exactly the kind of outright lie - in the Lancet no less! - that Scott is claiming you're not supposed to get based on his 'how to read the media' construct.
Wait, are there people who think the lab-leaked virus doesn't originate from wildlife?
"Our phones are made with all-natural semiconductors and all-natural Li-Po batteries, with an all-natural Gorilla Glass 9 screen and an all-natural plastic case. Get into the all-natural game, or explore the all-natural metaverse at blazing speeds on any all-natural social media platform. Or hook up the headset for an all-natural VR experience!"
I thought they stopped making Gorilla Glass from real gorillas around version 7. The fake stuff is nowhere near as good.
Yes. That’s what all the fuss about “gain of function” research is about. Basically, the theory goes that the virus may have been originally from wildlife, but was intentionally modified to be more infectious in humans, and only then was it leaked.
To me "originate from" sounds ambiguous, it could mean leaking an exact copy of what they collected in wildlife, or improving the version they collected in wildlife (as opposed to designing a new virus from scratch). Not sure which one you meant.
I do not pay much attention to this, but it seems to me that people working in Wuhan were definitely doing the research of the latter kind, in general, and no one is even denying this.
The questions are:
1) Whether this is where COVID-19 actually came from... or whether they were working on a completely different virus that *didn't* leak.
2) Whether "people in Wuhan improving bat viruses to better infect humans" also included the American scientists... or whether the American scientists working in Wuhan were working on something different.
3) If the American scientists working in Wuhan were actually doing gain-of-function research, whether they were funded by National Institutes of Health despite the existing moratorium on such research.
And my impression is that the answers are:
1) They deny it, and it would be difficult to prove either way.
2) Yes.
3) *Technically* no; in the sense that yes, those scientists got NIH funding, but on paper that funding was meant for something different.
So the "no" side is saying "there is no paperwork about funding for gain-of-function research". And the "yes" side is saying "well, they *were* doing the gain-of-function research, and they *got* money from you; someone just made it seem on paper that the money was for something different, and as we all know, money is fungible".
This is my impression I got from reading: https://www.washingtonpost.com/politics/2021/05/18/fact-checking-senator-paul-dr-fauci-flap-over-wuhan-lab-funding/ and https://www.washingtonpost.com/politics/2021/10/29/repeated-claim-that-fauci-lied-congress-about-gain-of-function-research/
"Did you fund GoF research in China?"
"No, we're much more responsible than that."
"But you funded the lab that did the GoF research..."
"Well, that was for something else. It wasn't for them to do the GoF experiments."
"A different project at Wuhan, not focused on GoF research?"
"No, it was for the same project. We funded the part of the project where they collect samples-"
"To do GoF research with?"
"Right. But we didn't fund the research."
"I thought the samples were part of that research, though. You have to have a virus to start with that gains the function."
"Well they didn't ever get the samples."
"But they got the money. So what do you suppose they did with that money?"
"I don't know, probably applied it to their current project."
"Which was gain-of-function research?"
"Right, but we didn't directly fund it."
They are condemning the *conspiracy theories* suggesting it wasn't natural, not the *well-reasoned hypotheses* about how it could have been unnatural. There were a lot of crazy conspiracy theories about the topic, and they were rightly condemned.
You are changing the words used to draw conclusions - this is just another example of what Scott was discussing.
I disagree with this characterization. It looks like you're doing the thing you're accusing Watson of doing, namely "changing the words used to draw conclusions". The plain language of the letter in part states that they "overwhelmingly conclude that this coronavirus originated in wildlife [2-10], as have so many other emerging pathogens [11, 12]."
Note that those last two references are of other pathogens that emerged without passing through the laboratory, and that is how they claim SARS2 emerged. The authors of the letter did not leave room to conclude they were talking about a narrow subset of laboratory-based origins. The statement was not as ambiguous as you're making it out to be.
I stand by my characterization as the following sentences explicitly call out conspiracy theories and praise scientific evidence.
"Conspiracy theories do nothing but create fear, rumours, and prejudice that jeopardise our global collaboration in the fight against this virus. We support the call from the Director-General of WHO to promote scientific evidence and unity over misinformation and conjecture."
You are strategically quoting the letter to make it seem like the signatories are concluding, when in fact... "Scientists from multiple countries ... overwhelmingly conclude"
Nowhere in the letter do they ever say, in so many words, that lab origination theories are also conspiracy theories.
Yes, but read the papers they are citing. Those scientists do not themselves "overwhelmingly conclude" what is claimed in the Lancet letter. Therefore, that claim "scientists from multiple countries overwhelmingly conclude" is made BY THE AUTHORS, not by those they cite. So either the authors of the Lancet paper are directly making the false claim, or they are falsely attributing the claim to others so it sounds like there's consensus about a thing they wish to say.
It doesn't matter whether I say, "Everything on the internet is true," or I say, "Abraham Lincoln once said, 'everything on the internet is true.'" Both statements are false. The second one tries to piggy-back on old Honest Abe to give me more authority, but if anything that should count as a second lie, not absolve me of telling the first lie.
Either way, they went on to make the positive claim - not attributed to those scientists from multiple countries, since it comes after that long list of citations - that this happened "as have so many other emerging pathogens." No plain reading of this letter supports the contention that they support a possible lab origin of the virus. They go out of their way to explicitly rule that out.
"Nowhere in the letter do they ever say, in so many words, that lab origination theories are also conspiracy theories."
Part of the confusion here is due to how you're using the term "conspiracy theory".
If scientists (presumably more than one) worked together to achieve COVID gain-of-function, and then it leaked (by accident or otherwise) and then they kept that fact secret, that would be a conspiracy. They conspired to commit a harmful act, and kept it a secret. Theories about this happening would be conspiracy theories. So the scientists are explicitly denying the lab leak.
You seem to be interpreting "conspiracy theory" not as "a theory that a conspiracy happened," but as "some crazy theory only wackos believe, which is false by definition". There is a motte and bailey happening here. The motte is "I'm only saying that the *really* crazy conspiracy theories with no evidence, like flat earth or inter-dimensional vampires, are false" and the bailey is "any theory that involves anyone conspiring in any way is false by definition."
"They are condemning the *conspiracy theories* suggesting it wasn't natural. Not the *well-reasoned hypotheses* about how it could have been unnatural."
Any well-reasoned hypothesis -- really, ANY hypothesis about how it could have been unnatural -- would be a conspiracy theory. If it is unnatural then, by definition, humans were responsible for it and kept their responsibility secret. That is the definition of a conspiracy.
Despite how the media uses the term, a "conspiracy theory" does not mean "some crazy, unsubstantiated theory," it means "a theory about people conspiring."
It's not an accident that the media has tried to merge these two meanings together, however.
Again - you have declared that there is a single definition for a phrase that, in fact, does not have a single "True" definition.
The letter says "Scientists from multiple countries have published and analysed genomes of the causative agent, severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2),1 and they overwhelmingly conclude that this coronavirus originated in wildlife." That was a flat lie if, as James says, the emails show that some of the signatories thought a lab leak was a likely explanation.
My friends who are relevant scientists generally went from believing lab leak was likely to believing it was very unlikely due to the structure of the virus not looking man made - information that was not available at the start of the pandemic but rapidly became available.
Some of them have updated back to likely with the refinement to the theory that it's a natural virus from gain of function study rather than a designed virus - but that's also compatible with the Lancet letter which is only strongly against conspiracy theories that suggest that it's not of natural _origin_ - which a lab enhanced natural virus technically is.
People do sometimes legitimately change their minds due to available evidence.
> but that's also compatible with the Lancet letter which is only strongly against conspiracy theories that suggest that it's not of natural _origin_ - which a lab enhanced natural virus technically is
And if the letter had been explicit about what it meant by "natural origin", and that "natural origin" included the not-too-unlikely case of an animal virus which has undergone gain-of-function research (which surely they must have thought about at the time), then the letter would have been very reasonable.
"Lab leak" in itself suggests an accidental release of a non-genetically-engineered virus. A designed bio-weapon virus was never seriously on the table; very early on, a virus made human-compatible through GoF was the hot topic. The absence of markers of direct genetic manipulation (as opposed to GoF, which I understand as a kind of accelerated evolution under artificial selection) is imho a smoke screen, a diversion, which worked.
I agree that Scott is overly optimistic, even if he's right about the presence of some red lines in the lying game. But as others mentioned, those red lines are not only unofficial (Scott is well aware of that, hence the whole difficulty of guessing them), they are not fixed in time, and they depend on the "expert" domain and, in the end, on each "expert". When you consider that most expert messages are not raw scientific literature, but the message of a specifically-picked expert reported by a journalist (who has another set of red lines), guessing red lines becomes a losing game. Scott's climate-change example/warning is great: sure, the full IPCC report is only biased in interpretation and is quite careful/weaselly enough not to directly lie. In fact, I think it's largely honest. The summary for policymakers is a very different beast already, and mainstream media reporting is one level up. But 99% of the public sees only the latter. Where are the red lines there?
Same for the Covid scientific consensus. And even there, we need to be super careful about what we consider the base level. I think it's direct technical articles targeted at other researchers, where lies and bias remain manageable... if you think the medical replication crisis remains manageable. But op-eds, summaries, and policy recommendations are on a very different level, even if they get published in scientific journals. Lancetgate, anyone? And it seems the non-consensus researchers for Covid are (still?) in a larger proportion (and often have individual publication indexes / pre-2019 reputations) than the NC scientists of climate change. I say "still" because once a topic is politically loaded, being NC becomes a poor career choice, so the local moral/group standard is subject to large external incentives that operate at career timescales, one of the largest factors behind red lines evolving. Red lines are imho largely fear of being caught with your pants down by your peers...
"Red lines are imho largely fear of being caught with your pants down by your peers." In fact, I'd like to push that further, because it explains something I intuitively do: paradoxically, trust an expert less the higher up in the hierarchy they are, and the more media-visible they are. In both cases, it means their peers are not really the other scientists/experts anymore. It's the other managers/politicians in the first case, and the journalists/media in the second, both of which have much weaker red lines than technical scientists...
One thing to consider is that Scott might not be overly optimistic, but that he never believed the Lancet letter, because it didn't cross the lines he has in his head; it is just hard to describe what those lines look like in writing. When all this was happening, (as far as I know) he never voiced his opinion on the lab-leak theory, which to me says that he never thought the evidence was certain in either direction. I think Scott is well calibrated, but it is impossible to put in writing all the rules you need to follow to be calibrated.
I share your impression, but it's probably because I often share Scott's opinions on "trigger" subjects, or at least the opinions I attribute to him.
And that's likely because Scott self-identifies as Grey tribe, as I do. He seems left-leaning while I am probably right-leaning (European here, so trying to understand the subtleties of US politico-societal categories is not easy), but still, we are close enough (as are most of the readers) to wonder whether the well-calibrated impression is really good calibration, or just an inner-tribe cozy feeling. I like to think that one of the Grey tribe's characteristics is to be really scientific in the old-fashioned way, and so especially calibrated to detect lies (non-facts presented as facts), so I lean toward the first explanation. Still, even if I'm not mistaken through tribal blindness, it means that such calibration is not really possible (or at least much, much harder to achieve) in other tribes. There you will not have any halo effect. Among the Greys, your social circle will really encourage fact-gathering and the scientific method above other factors, and you have friends who behave the same way and whom you can rely on without doing all the work yourself: social trust is at least partially aligned with truth-seeking. In other circles? Not so.
A blatant example is the poverty-EEG study: I don't think that piece could have been written by someone in the Blue tribe, even a Blue tribe scientist.
And yes, I think there are non-Grey-tribe scientists, and I think this is a problem, one of the factors behind the current issues even with scientific publication. Non-Grey scientists means Blue; the Red tribe has not really had any foothold in the science playing field since the seventies, as far as I understand the tribes :=)
Most likely, reds just keep their heads down.
Greg. Gregory Cochran. In the blogroll. But yes, he admits the "red-pilled" need to keep their heads down. He knows math; that may have made him last until 2015 as a research associate.
I tend to automatically dismiss anyone who makes statements about a virus "looking man-made" or not, on the grounds that either he or I must be gravely confused about basic biochemistry. I can't think of any way to look at a DNA sequence, or a protein amino-acid sequence, or even a full 3D X-ray structure of a protein, and be in any position to say "Huh! Looks man-made..." I mean, unless there was a tiny (c) 2019 WIV stenciled on the fuselage somehow. These kinds of statements remind me of "Intelligent Design" lectures in which, say, the structure of a molecular motor is thrown up on the overhead and the speaker exclaims "Look at that! All these trusses and gearings -- surely that's designed by an intelligent mind..." I can't see much of a material difference between that kind of argument-from-astonishment and an argument that a virus looks man-made.
Of course, it could easily be that some subtle higher-order analysis is being done here, maybe some kind of homology mapping to existing wild-type genomes or something, who knows? In that case "looks like" is far more subtle -- but therefore far more open to interpretive differences -- than the naive reading of the words would suggest.
If someone showed you a particle of mRNA vaccine under an electron microscope or sequencer, you wouldn't be able to identify it as artificial?
Depends. If it had a bunch of pseudouridines in it, yeah maybe. But if it really is "mRNA" meaning not some closely-related compound we're calling "mRNA" for convenience so we don't have to tack on eight syllables of organic chemistry prefix, no I don't see how one could tell. It's a more sophisticated version of the popular delusion that there's a difference between "artificial chemicals" in your food and "natural ingredients" because your fructose molecule was synthesized in a strawberry plant versus in a stainless steel vat at Archer Daniels.
I believe there's more to it than the pseudouridines. The sequences are more highly optimized than what's found in nature; e.g., lots of synonymous codons are replaced with more optimal versions that have more C and G in them. It's not impossible for such a thing to evolve, but it seems it didn't do so in reality.
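The GC-enrichment point is at least mechanically checkable. A minimal sketch, with made-up toy sequences (not actual vaccine or viral data):

```python
def gc_fraction(seq: str) -> float:
    """Fraction of G and C bases in a nucleotide sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

# Synonymous swap: CTT and CTG both encode leucine, but CTG is GC-richer,
# so codon-optimized constructs tend to show elevated GC content overall.
print(gc_fraction("CTTCTTCTT"))  # 1/3
print(gc_fraction("CTGCTGCTG"))  # 2/3
```

Comparing a sequence's GC fraction against the typical range for related natural genomes is the kind of crude signal this argument rests on.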
I credit microscopic epidemiology even less than the meter-scale version.
My understanding is that early gene-splicing techniques did leave telltale signs in the DNA structure, and it is the absence of those signs that was used as the basis for the claim that Covid couldn't be man-made.
However, newer CRISPR techniques (which the Wuhan lab would certainly have access to) do not leave these telltale signs, so the fact that this one particular older technique wasn't used, doesn't really prove anything. (And the people making that argument should have been aware of that.)
It was expressed more as 'if a lab was designing a virus, they wouldn't have done it like that' rather than 'it's possible to tell it's artificial even if for some reason they also tried very hard to mimic a virus of natural origin', fwiw. I'm not a biochemist myself so I can't directly evaluate the veracity, and I might be getting important terms muddled.
Yeah as I said that one strikes me as in the same class as the ID folks telling me there's no *way* the human heart couldn't have been designed by a superintelligent God-being because just *look* at how cleverly all the parts work together. I'm always underwhelmed by arguments from incredulity, or by the inference of human motivation and/or insight from the products. The human tendency towards anthropomorphization is just far too strong to trust that kind of argument. It's like the fact that if I don't find my car keys in the usual place, I'm basically driven by instincts to believe somebody moved them -- even if there exists many perfectly plausible alternate explanations, like my memory of where I left them is faulty.
Basically, by "it doesn't look man-made", they mean "it's either produced by evolution OR it's man-made by someone specifically trying to fake it being produced by evolution".
A fully-synthetic virus would have no particular reason to be extremely similar to pre-existing animal viruses, while a natural virus or a virus artificially derived from it would obviously have such a resemblance. COVID-19 is very similar to bat coronaviruses.
Patterns of codon use are another example. The genetic code has 64 codons but only codes for 21 different things; there is redundancy. However, for a variety of reasons natural genes are not distributed completely randomly among those codons; there is information there. Usually, artificial methods of producing genes use different signature patterns of codons than nature does simply out of convenience.
This method cannot produce a "definitely not man-made" answer, because man knows all of these signatures and can fake them if sufficiently motivated, but it is worth an update that it doesn't come back as "definitely man-made".
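The codon-signature idea can be sketched mechanically. A toy illustration (the sequence and the restriction to leucine codons are made up for the example; real analyses compare whole-genome codon frequencies against reference distributions):

```python
from collections import Counter

# The six codons that all encode leucine. Natural genomes use these with
# a characteristic, non-uniform bias; synthetic constructs often don't.
LEUCINE_CODONS = {"TTA", "TTG", "CTT", "CTC", "CTA", "CTG"}

def codon_usage(seq: str, codons: set) -> Counter:
    """Tally occurrences of the given codons, reading frame 0."""
    triplets = (seq[i:i + 3] for i in range(0, len(seq) - 2, 3))
    return Counter(t for t in triplets if t in codons)

# Made-up 8-codon stretch, all leucine, for illustration only.
toy_gene = "CTGTTACTGCTGTTGCTGCTACTG"
print(codon_usage(toy_gene, LEUCINE_CODONS))
# Counter({'CTG': 5, 'TTA': 1, 'TTG': 1, 'CTA': 1})
```

Whether the observed bias matches wild-type relatives or a lab's preferred optimization table is the update being described, not a proof in either direction.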
You're aware of fluctuations, right? The explanation for small-town "cancer clusters" and other surprises in the random variation of small numbers away from their expected value. The SARS-CoV-2 genome is only ~30 kb long, and the part that people would intensely study (coding for the S protein) is ~3 kb.
"Fully-synthetic virus" was only ever brought up as a straw-man to argue against. No sane person ever claimed that Covid-19 was "written from scratch" or that it wasn't closely related to existing bat viruses.
But if they took a bat virus as their starting point and then did genetic engineering on it to make it more dangerous to humans, that would still count as "man-made" for the purpose of the lab leak discussion. Even if they left 95% of the virus as-is and only added one or two "features". Likewise if they did gain-of-function through a "guided evolution" approach, without modifying the genome directly.
So "the virus doesn't look like it was written from scratch, therefore we can discount the lab leak hypothesis as a crazy conspiracy theory" is not an argument that one can make in good faith. Yet it was made.
I agree that this argument was made, and that it was not in good faith.
> My friends who are relevant scientists generally went from believing lab leak was likely to believing it was very unlikely due to the structure of the virus not looking man made
So why did they believe this was important? From the moment I heard of the Wuhan Institute of Virology, I had a model in my head where natural viruses are bred in captivity and can possibly escape.
> but that's also compatible with the Lancet letter which is only strongly against conspiracy theories that suggest that it's not of natural _origin_ - which a lab enhanced natural virus technically is.
This sounds more like deliberate misdirection. "We are confident that it is not [this thing that was never a concern]."
At the time a lot of people were talking about it being a deliberate engineered bioweapon or attack, not an accidental experiment escape. That's the theory that the Lancet letter was shooting down, not the lab accident one.
The news article that purports to show hypocrisy on the part of some of the signatories of the Lancet letter seems to be constructed in exactly the way news orgs lie all of the time.
The evidence takes quotes out of context from emails that we can't see, to make it seem like some of the scientists had high levels of confidence about a lab leak. But everything stated confidently in the summary is then hedged with "may", "might", "could" in the detail.
Another great article. It seemed like you were humanizing our two political tribes to one another. A good cause!
Heartily agree!
Heh. Not the ultimate.
Not that I think you'll notice.
This comment is great; depending on which side you're on, either the article or this comment are extremely funny.
User banned indefinitely.
These types of subtle nuances in 'bullshit-detection' calibration seem like a skill that is apparently very difficult for a lot of people to attain, and I'm thus pretty skeptical of any efforts to teach this kind of stuff in public schools in the form of 'media literacy' (though I would love to see evidence to the contrary). It seems just as difficult to teach as charisma: a lot of subtle nuances that are hard to communicate as bullet points, and that are better represented as complex multivariate distributions. I'm curious if anyone feels like they were 'taught' how to be really good at this by anything in particular.
Formal schooling doesn't even do a particularly good job of teaching things for which there are well understood, formal frameworks of right and wrong, like in the case of mathematics or foreign languages. There's not much reason to think it would be particularly adept at conveying a formal curriculum of media literacy. But it's an important soft skill that applies across a wide range of disciplines so you shouldn't be purposefully avoiding it either.
History at my school in the UK when I was about 15 was pretty good at this - primary and secondary sources, historiography generally. Not the specific detailed complexity in this post, but it felt like a good starting point for me. I think it ultimately helped me detect the newspaper and Government lies leading up to the Iraq war.
Yeah, GCSE History in the 1990s went pretty hard on historiography and sources and I think benefitted a lot from doing so.
It also has the advantage over trying to do it with current media that it's less controversial - if you're trying to tell people that the current President or PM is lying, then parents will complain; if you're telling people that some nineteenth-century President or PM was lying, they won't.
It does help that we often have the papers from both sides of any 19th c controversy.
Also, all the secret ones and most of the private ones.
This was my experience though probably not until college (and not in the UK), but I definitely think it can be taught. Maybe a course on the history of political propaganda and the psychology of marketing should be required in high school now like civics was required a generation ago.
Rhetoric is also pretty useful for this purpose -- for mapping out the typology of logical fallacies and where they show up.
I think this would be hard to teach well for the same reason it's hard to teach people to detect lying. You can give some heuristics that will help, but a lot of it is System 1 stuff that you train through experience.
Not to mention you're also teaching liars how to step up their game with the exact same curriculum.
Even if it *could* be taught, a cynic might wonder if the authorities who design school curricula are actually interested in giving all students finely-tuned bullshit detectors. Be careful with that thing, you might accidentally point it at our own side!
I believe my ideology can stand up to a finely-tuned bullshit detector. I assume other people believe the same of their ideologies.
We should teach people to lie. Make games where lying like this is a skill. The best way to defend yourself against an attack is to have practice using it yourself.
I don't think that's likely to help. If you try to teach students math, do they all learn the same amount of math? If two people both learn to lie, but one has a deeper understanding of deception than the other, the person with the deeper understanding of deception is systematically more easily able to lie to the other and catch attempts at lying.
Also, everyone already learns about how to lie *to some extent* without formal instruction. Everyone already has knowledge of how to lie, the question is not whether they know how, but whether there are gaps between their knowledge of deception and others'.
Could you reduce the extent of the gaps in people's deceptive ability by putting everyone through formal instruction in lying? Maybe, if you put everyone through enough of it, but teaching anything is also an opportunity cost, and I think even modest gains here would probably demand a heavily lying-based curriculum.
Yeah, you’re probably right. “How can we teach ordinary people how to reason accurately in the face of dishonest leadership” may end up being a silly question.
it's been pointed out that many debunkers came from the field of stage magicians. perhaps teaching kids stage magic, which would be interesting, would help them understand the same concepts but in a more intuitive way?
I think this would probably have the same problem as teaching lying directly, but more so. Some people will learn to be quite good at managing deception, others won't, and the goal isn't to turn out some people who're exceptional at managing deception, but to close gaps in ability.
Stage magicians are often skilled debunkers, but among people who study stage magic, how many people ever reach the level of professional stage magicians? In fact, stage magic might not even be teaching them skill at managing deception at all; you might observe the same thing if it were purely a process of selection, where the people who're most talented at deception tend to rise to the top of the field of stage magic. If that were the case, teaching kids magic to improve their abilities at managing deception might be like teaching them basketball to make them taller.
I think it's likely that learning magic at an elite level does cultivate deceptive skills, but I also don't think it's the case that dabbling in stage magic does much to prevent gullibility or credulousness, so I don't think some mandatory magic education would do much to close gaps in deceptive ability.
Growing up in rural Texas, I'd notice parents playing this game with their children all the time. How far can the parent stretch a truth into an outrageous fabrication, and for how long, before their kid notices? They do simple stuff when the kid is young (say, around 4), and it can get pretty deep when they're teenagers and they can dare to try against the parent. When the mark notices, it's great fun. And the game is *always* afoot. And it's useful, as an inoculation against shysters.
There's a particular flavor of this game I see in Texas that seems to extend across the southern US, and I fully expect analogs of it worldwide.
I've observed this sort of behavior in a couple different cultural contexts and suspect that it goes a long way towards inoculating people against gullibility.
Having also spent a lot of time in STEM environments and noting the high levels of credulity in people in those environments, I've also wondered whether constantly assessing the trustworthiness and intent of an informant competes with the ability to deeply process the content of the information. That is, it might be hard to learn algebra if you're constantly asking whether Lang is trying to pull a fast one on you.
I actually had an extended encounter online a couple years back with a serial fraudster. I cut off dealings with them pretty early on without any loss on my part, but I spent months trying to prevent them from taking in tens of thousands of dollars in Kickstarter fraud and catfishing a guy out of his life savings. I succeeded in the first goal, but failed at the second.
The catfishing victim was a Russian man, and when I started trying to convince him that his "online girlfriend" was bad news, he actually thought that *I* was the naive one. We talked (and argued) about this at length, and the impression I got was that, growing up and working in a lower-trust society where he was used to the idea that a certain amount of dishonesty is necessary to get by, he was actually way *less* sensitive than I was to the signs of "this person is clearly too untrustworthy to deal with." Because he was constantly dealing with people who were engaged in various sorts of duplicity, but not screwing him over personally, he just accepted the idea "this is how people normally behave," and didn't think "this is a warning sign that they might do the same to me."
In some respects, he was an unusually emotionally vulnerable and gullible person, and I'm sure the average person in Russia isn't like this. But for me, it really hammered home the idea that adjusting to a low-trust environment won't necessarily make you good at spotting deception, and some people will continue to be very bad at it.
This is also somewhat the case when you've lived in marginalized circles. Things that are "red flags" for other people, often erroneously, are just common things - and being able to distinguish "non-conformist with some unusual personal issues" from "scumbag" can be a legitimate challenge.
That's not really the way science or math works, when you're doing it right. Those are fields in which doubt is institutionalized. The way to learn algebra really well is to doubt *everything* you read in the math book, and sit down with paper and pencil and test it for yourself. Aha! You say x + x = 2x, nonsense! Let's try a few examples, ratfink. 2 + 2 = 2*2 = 4, hmm, check, let's try 3...et cetera. That's the way to learn the subject thoroughly and well, and anybody who masters math (or science) understands this.
Same with professional communication. I don't write *anything* in a journal article that isn't backed up, either with data and calculation right there, or a footnote to acre-feet of data and calculation elsewhere, going right back to the origins of the field. I don't ask the reader to take *anything* on my mere personal assurance, because my default assumption is that he is a hardened skeptic and will not believe anything without very substantial six-sigma proof.
But... sometimes, realizing you have the right kind of institutionalized doubt can leave you blind to kinds of bias you hadn't considered... or biased towards others who you assume are kept honest by the same commitment.
Or... and this is a more complicated thing... not realizing that certain types of communication where one is not literally spelling things out are not dishonest.
I'm sorry, I can't even parse that. Can you give me an explicit example of what you mean? If you're pointing out scientists (or specialists of any sort) are just as gullible and irrational outside their area of expertise as any other schmo, sure, of course, no question about it.
I've seen the same thing in lower-income urban cultures. More susceptibility to some kinds of conspiracy theories, but more of a sense of suspicion against hucksterism.
It's interesting to compare the notion of evidence in science vs. law. There is considerable overlap, but there are differences too. The rules in law have evolved to deal with game-theoretic issues that the sciences mostly scrub themselves clean of.
The citizen watching the news should mostly be using lawyer-like rules rather than scientist-like rules. This is natural enough when we witness a bunch of ex-lawyers arguing in Parliament, but what happens when The Science becomes the news?
I'd love an example or two
Since the game is always afoot, it's usually everyday stuff. Something like this:
"Says here in the news that some organs in the human body might be unnecessary today."
"Y'know, my uncle had his large intestine removed."
"Really? I thought you needed it."
"Well, guess not. They took out the whole thing."
"So how does he, you know, do his business?"
"They sewed the other end to his you-know-what."
"I still don't see... don't you need that intestine to, uh, absorb water or something?"
"Oh, that's just it - my uncle barely needs water, either."
"What??"
"Yeah, earlier genetic defect - he had some DNA missing. Had to replace it with cactus..."
...and you just go on from there. Part of the skill is making it sound perfectly ordinary, and that *disagreeing* would be weird.
>We should teach people to lie. Make games where lying like this is a skill.
Allan Calhamer took care of that almost seventy years ago. And the current Diplomacy game over at DSL may be coming to a close in a few more (game) years, so if you need a refresher course in lying, detecting lies, and establishing trust when you and everyone around you is a liar, feel free to sign up for the next one :-)
Or maybe start a parallel ACX game.
DSL?
"Data Secrets Lox", an ACX affiliated bulletin board. There were a few online Diplomacy games set up there when I was active, and a few more on Slate Star Codex back when it was active.
Thanks.
That's apparently the Romani approach, at least according to Anne Sutherland's book on them. It's treated as a game.
Interesting - is that the more recent book, "Roma: Modern American Gypsies", or the older "Gypsies: The Hidden Americans"?
I think probability and statistics is a prerequisite for this skill, but not solely enough in itself.
I would add to this good LSAT test prep classes. The reading comprehension sections are designed in part to penalize readers that treat phrases like "X mostly does Y," "X usually does Y," "X may lead to serious consequence Y," and "there is suggestive evidence from distinguished scientists at top universities that X could lead to Y," as synonyms for "X does Y."
There are a lot of widely available practice tests and it's easy to assess performance over time. I think after the first two tries anyone who has a competitive or vested interest in trying to improve their score will learn that there is a cynical mode in their brain they can and need to switch on.
One of the most useful approaches to news and media I learned from The Last Psychiatrist back when that blog was active. "What do they WANT to be true?" I try to teach my kids to ask that question about every article they read and even to ask it about their teachers at school. I think it's a good framework for getting to the real facts in the world without getting too far into conspiracy rabbit holes.
Does make you pretty cynical, though.
An approach taught in my Indian philosophy class - read every article 3 times - first time without analyzing, second time arguing along with the author (thinking of examples which support the author), third time arguing against the author (thinking of counter-examples. logical fallacies) before coming to a conclusion
Most of the time "media literacy" stuff seems to focus on identifying sources and not on perhaps more important things like how things are phrased and what information is left out.
I think I improved greatly in this area via high school speech and debate competitions. These required me to develop an argument - often about real-world issues - that was then subjected to intense scrutiny and literally scored in a competition. It trained the skills of both sound, logical reasoning, but ALSO how to present your arguments in a compelling manner. As someone said above, practice giving misleading-but-not-lying speeches is a great way to see how it's done and learn to see it in others. And in a fake debate competition context where often you have been assigned to the side of the argument you don't even support (another HUGELY useful practice), it's low stakes so you don't have to burn bridges the way real world arguments do.
I credit debate with making me better at speaking and reasoning, more sympathetic to those I disagree with, and more able to detect and dodge rhetorical flourishes and misleading arguments. Strongly recommend it as a way that's usually available in American public schools to train critical thinkers.
By contrast, I would say from my experience that Model UN is not good at this. People give the same speeches, and it's really about negotiating who is the leader and who can take credit and jump on bandwagons, nothing really about the arguments themselves, and it's very, very boring.
I learned this by having bad parents who manipulated me all the time - it took me 30 years to figure them out, but it's sure helped me understand a lot of other interactions.
It's not something I would particularly recommend.
I was also thinking 'bullshit-detector' and 'media literacy' are what Scott is talking about. No idea, if it could be taught. But let's try! I read an interesting article years ago about things that can be learned, but not taught. Things can be learned by observing experts and experimenting, but not formalized. The example in the article was Chick Sexing (determining the sex of baby chicks). Something like that could probably still be taught in schools. It would just have to be showing lots of examples and having students read and analyze things.
I've argued for a long time that the critical skill not being taught is evaluating sources of information on internal evidence. Conventional school anti-teaches it. There are two sources of information, the teacher and the textbook, and you are supposed to believe what they tell you. Browsing the web, better yet getting involved in arguments online, is better. Anyone not braindead can see that the web is an unfiltered medium, hence the fact that someone says something online is no evidence it is true, so you need to develop ways of deciding what to believe. If you do a bad job of it you get embarrassed when you have been arguing that Adam Smith was in favor of public schooling, because someone said so online, and someone points you at the actual passage that makes it clear he wasn't.
Charisma - mostly by observing other people who were charismatic over long periods of time and mentally taking "notes."
But it's definitely not a "bulletpoint" thing - it's a "mode" of interaction that requires sorts of "parallel" information that can't be easily learned the way rationalists usually learn things.
A lot of it requires being able to use compartmentalized beliefs - like how to "interpret" others' signals in a way that suits your own purposes and then shifts the conversation in that direction.
A caveat: I am "Geschwind type" neuroatypical, which I see as the opposite of autism on the spectrum - still, it took time to get from "non-verbal information I am consciously aware of" to "mode where I can respond to that dynamically without actively processing it."
Excuse my poor typing.
Being able to model minds not just on a "deconstruct their reasoning" but on an emotional level also helps greatly. Maybe practicing that?
Re the Swedish piece - they have rules about what kind of research is ethical, and you have to have certain kinds of permits. The argument is about whether the authors followed the rules. I think they did follow the rules, and the prosecution is foolish - as the initial prosecutor did! For Scott to make this about "the establishment" - just really weird. Certainly doesn't make the point it seems he's trying to make.
Scott is suggesting that the establishment is more likely to *notice* this sort of infraction - the fact that rules weren’t perfectly followed - if the establishment doesn’t like the conclusion. Had the results been different, nobody (or fewer/less credible somebodies) would have complained to the authorities so there would have been less reason for an investigation to happen. Bias is introduced both at the reporting level (if one side is more likely to complain) and at the response level (if one side’s complaints are more likely to be taken seriously and turn into prosecution) even if the underlying rules about what *should* happen seem perfectly clear and objective and even-handed.
What do you mean by "if the result had been different"? It seems reasonable to think that if the results had been that native, Caucasian Swedes offended at a disproportionate rate, it might have been just as likely for there to have been an investigation, no?
If they were equal, I predict they wouldn’t have at all and the point follows through.
We're each entitled to our own unfounded opinions....
Sure. But I wanted to know if you would predict an investigation if both were found equal?
I think your example (natives found more violent than immigrants) is avoiding the strongest form of the argument. If you do think an investigation would happen if both groups were found equally violent, then that clarifies the disagreement.
Ah, that's helpful! Thanks! I don't know how I'd normalize the distribution. In that case, I think I agree with you. I'd have to go back and look at the story again to be sure, and I don't have the time right now.
But that's basically just saying that studies with interesting results get more scrutiny than uninteresting studies. That's obviously true - interestingly-sounding studies get more media attention and reach more people so there is more chance they get into the sphere of attention of some regulator. That's very far from saying that studies whose results support anti-establishment ideologies get extra regulatory scrutiny (still a plausible claim TBH).
No, it's saying that studies which confirm a certain set of priors tend to skate, while studies which tend to cut against those priors get scrutinized.
I refuse to believe that the Swedish authorities are actively poring through all the published papers that *anyone* has written looking for this kind of violation.
The only reason charges were brought in this case is that some busybody - likely a fellow academic - ratted him out to the powers-that-be. And the most obvious reason why that might have happened is that the complainant found the paper’s results offensive and was looking for a way to discredit the author for culture-war reasons.
If the paper’s findings are what caused a complaint to be filed, then a paper that either found no significant disparity or found a disparity in a direction that *reinforced* the dominant narrative would have gone unchallenged or at least would have been challenged *less* forcefully by *fewer* people than this paper was…which would substantially reduce the odds of charges getting filed.
That conclusion is inherent in the phrase “dominant narrative”: what it MEANS for a narrative to be dominant is that support for the narrative passes unchallenged while opposition to it does not, no?
The only way charges would have been filed if the paper had had different findings is if this were *personal* - somebody had an existing grudge against this particular researcher for some *prior* offense and this paper *incidentally* offered them a chance at payback. But my money’s on the other option. If we had a parallel world to run the experiment in I’d offer 20:1 odds the finds-no-difference paper passes muster with no legal challenge.
In modern society, a lot of things are illegal. Most people have done something for which they *could* be jailed.
Most people are *not* jailed, and not due to courts acquitting them, but because they're never indicted in the first place. Some of this is due to nondetection, but a large part is due to prosecutorial discretion. That is to say, a prosecutor can choose what he/she does and does not take to court. Note that there is very little accountability for this discretion; cases that don't go to court are normally invisible, and cases that do are usually seen as reasonable because the suspect is guilty (due to the first point: everyone is guilty).
When everyone is guilty but not everyone is prosecuted, prosecutors can use their discretion to pursue ideological projects by selectively jailing people they don't like. This is what Scott is alleging; that prosecutorial discretion would have spared someone whose study had the opposite result. (This is *very* hard to confirm or refute, which is part of the problem.)
"I think, I think, I think." You know, Scott, if you had even an iota of data here, instead of your unbounded faith in your own gut intuitions (aka priors), you might have something valuable here. All you're saying here is "If my unsubstantiated belief 1 is true, and unsubstantiated belief 2 is true, boy is that ever outrageous!"
For those interested:
I think any researcher who found that immigrants were great would not have the technicalities of their research subjected to this level of scrutiny, and that the permissioning system evolved partly out of a desire to be able to crush researchers in exactly these kinds of situations. I think this is a pretty common scenario, and part of a whole structure of norms and regulations that makes sure experts only produce research that favors one side of the political spectrum. So I think the outrage is justified, this is exactly what people mean when they accuse experts of being biased, and those accusations are completely true.
He's quoting directly from Scott's article to complain about the lack of evidence for that specific claim.
This doesn't seem to be necessary (i.e. doesn't seem to be making any sort of interesting claim which would justify the ways in which it's a bad post).
Clearly you're not a philosopher.
Well, I'm sure a philosopher wouldn't think that I'm intentionally a philosopher.
I suspect one could readily build a philosophy of philosophical unintentionalism.
Be cool, man
And again, he fails to distinguish between opinion pieces in the Washington post and news articles.
"Finally, the Marx thing was intended as a cutesy human interest story (albeit one with an obvious political motive) and everybody knows cutesy human interest stories are always false." It seems he kind of does. But largely, I agree that there's more nuance to be had in dealing with the various tentacles of a given media apparatus than was conveyed here, esp wrt Fox.
I cannot find the word "opinion" anywhere on the Washington Post article about Lincoln. The URL suggests it is in the "history" section. I agree there is some vague sense in which it is more of an "opinion" piece than the election reporting, but separate from an obvious THIS IS AN OPINION FLAG, that's exactly the kind of not-universally-understood heuristic I'm talking about.
Would you call the poorly-reported childhood EEG study I blogged about recently in the NYT an opinion piece or not? If yes, how is it different from any other science reporting?
For the Lincoln/Marx piece we see it's on "Retropolis", and "Gillian Brockell is a staff writer for The Washington Post's history blog, Retropolis." So, we are on a blog, which is very much not an "article."
These distinctions are really important for understanding what you read.
I see this as pretty much reinforcing Scott’s point rather than diminishing it, though. “The Washington Post will tell different, somewhat more brazen lies in their blog section” is the sort of mostly-reliable heuristic that you need in order to have any chance of discerning the truth value of the news.
Yes, it reinforces the point of this piece. I just wish Scott would take his own lesson to heart and make an effort to use precise and correct words for published content, as these really do matter.
In a sense I think it almost shows the opposite. The Post thinks of the distinction between the blog and the news as the kind of transparent and legible distinction that makes things clear. But there’s a lot of redundancy in the signal too - the blog and the news have different kinds of stories and are written in different styles, so that even people who fail to pick up on the transparent signifier can still develop the kind of useful heuristics that lead them to understand where different levels and kinds of credibility attach.
How likely is it that the average reader is making that distinction? I also don't see any reason to excuse the Washington Post for publishing potential falsehoods just because it's on a "blog." We are right now commenting on a blog. Would we excuse Scott if he published complete falsehoods and lies?
We understand that Scott's blog is Scott's responsibility only; we don't go and blame SubStack for lies on Scott's blog. Similarly, no one here is defending blog author Gillian Brockell.
A "blog" is the author's writing with minimal oversight. If it got a bunch of editing and fact-checking it wouldn't be a blog. The WaPo would call similar writings that had been through the full editorial process something else like say "features."
Now the WaPo doesn't literally have zero responsibility for the blog - they chose to hire this person - but the organization doesn't "stand behind" blog writing in the same way that it would for real news articles.
But again, how likely is it that the average reader is making that distinction? I don't think your interpretation is the one held by most people. Blogs are no longer just places to write opinions or thoughts; they are often at the forefront of new reporting and are cited by mainstream news outlets all the time. I would also argue that, though it's called a "blog," this one in the Washington Post doesn't actually fit the commonly held view of a blog as a private place for one person or a group of people to publish their writing. In this case, WaPo controls everything about the blog except for, presumably, the topics covered in it. But publishing, marketing, and distribution are all covered by WaPo. This seems much more like a column to me, or at least a distinction without a difference.
It's interesting that you analogize to a column. I think even a below-average reader understands that George Will's columns are not backed by the Washington Post in the same way a news article is.
I don't know for sure, but I'd guess that the average subscriber to the Washington Post understands the difference between the blog content and the straight news content, but the typical person just clicking on WaPo links via Twitter probably doesn't.
With respect, I'm not going to take the time to find an unlinked story. Also, are you calling for big, bold labels "THIS IS AN OPINION" and "THIS IS A NEWS STORY". Because that was what I took away from your post.
> I'm not going to take the time to find an unlinked story
I thought you already found it in order to make your initial comment. You said:
> And again, he fails to distinguish between opinion pieces in the Washington post and news articles.
If you never found the story, how do you know it was opinion?
Back in the old days.... we had news pages and opinion pages and they showed up consistently in their respective separate places in the newspaper. And TV news had its own very structured and consistent format. 60 Minutes was like revolutionary for providing a mix of (sort of) news reporting, analysis, and some other just goofy shit all on the same show. And in terms of news consumption, the average (U.S.) public took in maybe three sources of news at most, all with these stable and familiar formats.
And then the internet and endless cable "news" TV happened and we've had two generations now of people who don't have these earlier reference points deeply ingrained into them. Stuff of both flavors -- news and opinion -- shows up everywhere all the time. And then print media and cable news, now needing to provide ever more flavors and variety of content spawned all kinds of in-between-y formats that get called things like "essays" or "analysis" or "blogs" or "topical newsletters" or "explainers" that are neither news nor opinion pieces, and are often huge amounts of nonsense chasing ad revenue.
I remember when Vox first started publishing "explainers" I would have to refrain from emailing them my ranting frustration that their explainers had all these subtle biases imported into them and how much more insidious that is than other kinds of "news" reporting because people didn't have the skills to interpret the bias of the explainers where we sort of had skills to interpret the bias of news. But that was a long time ago now too.
Scott seems to be writing from inside the first generation that didn't experience the predictable clarity between "news" and "opinion" as it played out in the more limited forums we had back in the old days. And so Scott seems less clear about the distinction but also the distinction is so much less clear than it once was, and it's only Gen X and older who would have the same kind of internal reference point for how this all used to feel, which is so hard to describe now relative to how it actually is.
(and I did have to walk to school backwards uphill with cardboard strapped to my feet)
A dimension of this mess that Scott is not touching on here is the whole „so who is an expert“ quagmire. Think of the Covid fiasco, and the plethora of „experts“ on all sorts of things it brought out of the woodwork. For people who are struggling with understanding a complex situation, it’s often not a trust the experts vs. distrust them situation: it’s „who are the experts in the first place“?
Exactly. And once you have in your toolbox "this guy is not really an expert" + "he's not really lying literally" + "he's literally lying but that's just part of the game", seems like you have too many degrees of freedom when calling things "not total bullshit".
Exactly. Look at the google scholar page of someone like Peter McCullough. Sure, he’s not a virologist, but he is an expert in a relevant field and frankly has the credentials and publication history to back that up. Yet, he goes against consensus expert opinion. I’m not saying he’s right, but I can’t exactly dismiss him as not being an expert. Now, medicine isn’t my area of expertise, but I am a scientist. I have the sense to at least look at someone’s publication history. The average person is not going to be able to do that.
So you have another issue here: consensus vs dissenting voices. I’m personally in favor of hearing out dissenting voices, but I must admit, I’m having more trouble establishing what’s true and false in the current climate than I would like.
"That's probably a bigger lie (in some sense) then one extra mass shooting in a country with dozens of them"
bigger than
And
"people can’t differentiate the many many cases where the news lies from them from the other set of cases where the news is not, at this moment, actively lying."
Should "from them" be "to them"?
It's not literally true that "experts in the field won't lie directly". There are two ways in which experts in the field will totally lie, and do so all the time. First, they'll be mistaken (maybe you don't count that as a lie, but from the point of view of an observer it can be functionally the same). For any proposition X, there's some distribution you get if you ask experts "how likely is it that X", and there'll be some (hopefully small) fraction of experts who are just wrong. Second, there's some fraction of experts who lack scruples. It can be a small fraction, I don't care, but it's nonzero, and so you can always find an expert to go on a podcast and blather any claims that you want.
This wouldn't matter, except now the other experts (who aren't grossly mistaken, and who have scruples) are likely to become in a sense complicit. "Not lying" is much easier than "calling out a lie". There are many reasons not to call out a lie --- political inconvenience, being associated with icky people who also call out the lie, not having enough time. People don't generally think (even if they claim otherwise) that there's some strong moral requirement to put your career on the line to correct some false statement made by a supposed "expert" in a paper, or online, or in the news. It's easy to justify inaction by saying "oh, the lie was of little consequence", without noticing how often that really means "of little consequence to *me and mine*".
The result of all of this is that if you consume the news, or the scientific literature, you can in fact be consuming outright lies. The small fraction who are grossly unethical, or outright stupid, make the lies (crossing the line!), and then others are reluctant to do anything about it (not crossing the line).
This doesn't invalidate the central idea of "bounded distrust". It's still the case that a sufficiently extreme lie ("the normal distribution posits that there is a normal human", or whatever that was) will receive substantial pushback --- although note that even there, people were reluctant to be associated with Razib Khan, and so took their names off of the petition! But this does move the invisible line of "things that are just not done" a bit further in the direction of dishonesty. What matters isn't so much what the median expert will *do* as what the median expert will *tolerate*.
From my viewpoint, it looks like the median expert will tolerate quite a lot of dishonesty, as long as it's "not of any consequence (for me and mine)". This varies by field, of course, as some fields have more of a culture of rudely calling out bad claims than others.
Other collective effects also reduce (in my eyes) the trustworthiness of amorphous "the experts". Just one example: who are the experts? Unscrupulous and incompetent researchers can create, by exploiting the politeness of their peers, an entire body of poor literature (here I'm thinking of "near-term quantum simulations", but there are plenty of others!). Now if I want to query the experts about this literature, who do I ask? The people who write papers about it? Not a good strategy, but it's very difficult to know who the correct expert to ask is. Should I ask "the inventor" of mRNA vaccines about the properties and effectiveness of the Pfizer/Moderna vaccines?
The upshot of all this is that if I have a friend who knows something about a field, I'm not particularly sensitive to all these collective effects, and I can reliably extract quite a bit of signal. If I'm relying on observations of the behavior and claims of "the experts" and "the journalists" and "the politicians", then even under optimistic assumptions about their individual honesty and competence, the amount of available signal is substantially reduced.
The World Socialist Web Site did a fine job of rounding up five famous American historians to denounce the bad history in the New York Times' "1619 Project." So, it's not always impossible to get real experts to speak out.
Sounds like one of those "Fifty Stalins" things where it's only safe to attack from the left.
The cited scientists and historians weren't generally leftists.
Yes, there is a political reason: "wokeness" runs counter to those who believe wholeheartedly in traditional "class war" Leninism/Marxism.
Shared lies signal group loyalty.
This isn't a great example, because there actually were a lot of historians who took issue with it and with the NYT's promotion of it.
The concept of "everything" seems to very easily morph into the concept of "anything" in people's minds without them really noticing the difference, i.e. "You can't believe everything you read" becomes "You can't believe ANYTHING you read" and is defended with arguments that only support the former statement, not the latter.
If you don't have a reliable way to tell the non-trustworthy things from the rest, then they are actually the same. "Some of the oranges in this bowl are dangerous to eat" implies "you shouldn't eat any of the oranges in this bowl", if there is no good way to tell the bad oranges apart from the others.
If you assume there's no way to tell the difference between any two things, then you're defining "everything" and "anything" to mean the same thing, which works for your argument but plainly contradicts the meaning of the words.
I'm not defining them to be the same thing, I'm just saying that as long as I cannot tell the difference I have no choice but to treat them the same. I'm not sure what we're actually disagreeing on here.
Martin is right: he's not conflating everything with anything in general, but only on trust. If you have a bunch of facts and you know some are true and some may be false, without knowing which is which, you are forced to say all may be false so none can be fully trusted.
Not if you do the Bayesian thing, where you hold different confidence estimates for your beliefs and weigh them against the risks and rewards of holding the wrong or right belief.
Like "some berries are poisonous" might be a good reason not to eat random berries off bushes if you aren't sure which are which.
On the other hand, something like "I can't be sure how much mold is on this fresh fruit, so I'm not going to eat any fresh fruit" is a bad heuristic to have.
On a bit of a joke note, I do like to imagine people playing this game with dangerous recreational drugs.
Is trying fentanyl a good idea or a bad one? I used to wonder how anyone could possibly think trying heroin was a good idea; like thinking about it and asking, "Gee, is this likely to result in a good or bad outcome?"
Of course, the reality is that by the time people are ready to try heroin or fentanyl, they are already deep down the rabbit hole of bad heuristics.
Not so. For example, if you're starving, "some" implies you should risk it.
True -- but still, it would be fair to say that all the oranges are dangerous. If one orange in a bowl of ten is poisonous, but I can't tell which is which, then from my perspective each of the oranges has a 10% chance of killing me, and is thus a dangerous orange. (Even if I may still need to risk it, if the alternative is certain starvation.)
Likewise, if 10% of news articles are highly misleading to the point of being basically false, but I don't have a good way of identifying the misleading ones, then it's fair to say that from my point of view all news is untrustworthy.
In both cases, the fact that there may exist other people who are better at identifying which of the oranges is poisonous / which of the news articles is trustworthy, and who can thus safely consume the rest, is of no help to me.
unless you know and trust one of these people
I think it's worse than that. "You can't believe everything you read" becomes "You can choose which things you want to believe and which you don't".
Even a big media skeptic forgets his skepticism the moment he reads something they really _want_ to be true.
John Oliver offered an example of this about five years ago.
https://youtu.be/0Rnq1NpHdmw?t=855s
John Oliver is the worst. Apparently clueless to the reality of everyday humans.
how so?
If you encounter "you can't believe anything you read" in a written context, you can just make their brains explode, Captain Kirk/Gödel style.
'ambiently watching the TV at the gate.' I've never seen ambiently used like this, is it a typo or actually idiomatic?
Idiomatic for me! It's like when there's something playing in the background that you're not intentionally paying attention to. I'm also reminded of Paul Graham's coinage of "ambient thoughts" from http://www.paulgraham.com/top.html .
For me the strangest thing about that sentence was the existence of a TV at an airport gate. Hospital waiting room, sure. But I've never seen TVs at the airports I most frequent. Now that it's been brought to my attention, it looks not so different from a hospital waiting room on the relevant properties, so I wonder why they are not there (although I personally prefer their absence).
I see TVs at airports all the time (invariably playing CNN) and often have to do a lot of work to find a spot where I can sit without seeing or hearing them.
What airports do you go to? For me, the biggest advantage of having the status that gets you into the airport lounges is that it means you can find a space that doesn’t have a tv playing CNN airport edition.
I almost only travel within Europe. I vaguely remember watching part of the Olympics at an airport when I visited Canada many years ago, so indeed I have watched TV at an airport at least once, but these don't seem to exist in the places I know on this side of the Atlantic.
It's intriguing that TVs are added to waiting room-style places even though this is not an obvious boon, since there are costs associated with having a TV on 24/7 (or however long the waiting room is open). In cities with exactly one airport, you can't really decide to visit or avoid an airport on its TV-having status, so this shouldn't really be a consequence of competitive pressure... unless the whole point is that this started in closely clustered airports and grew from there?
Really, I'm intrigued. Why do something rather than nothing? Were there riots or loud talking in waiting rooms without TVs that led to someone having this idea? Was it that someone working at the doctor's office / airport / wherever was bored out of their mind and convinced their boss to install a TV, and this somehow became mainstream?
TVs can also be used to show information and notifications, and I suppose, once you have those, you may as well show something on them.
I have heard that CNN pays airports to have TVs, which must of course be constantly tuned to CNN.
I'm not sure if it's true or not, but I wouldn't be surprised if it is.
Probably a transition from magazines, as there just aren't as many magazines published anymore, especially ones that are "politically inoffensive"?
I wish they'd get rid of the TVs. They do announcements over loudspeakers, which I can't hear because the TV is talking over them.
What?! Then it's even worse than I'd thought. I was assuming the TVs were muted, which is my experience in doctor's offices around here.
Great article examining something common that usually isn't thought about explicitly. I think trust is in most situations contextual - I know people who I'd trust not to steal or lie but not to show up on time etc.
For the political implications, I think trust and power are closely connected, because in a sense if you trust someone you give them power over you: they can then control what you believe, which will in turn influence the choices you make.
Where this gets dangerous is not so much people giving up and trusting no one. It's when someone comes along with the message "all institutions are bad, trust no-one but me" and people believe them.
Because at that point, since trust = power, they have quite a lot of power. You could do almost whatever you want and people will still support you. For example, you could say that you didn't lock up your political opponents - they committed crimes. Or that you didn't overturn the election - you just found fraud. Or that you didn't start the war, or that the war was necessary.
With you mostly, but I was waiting for you to acknowledge you were wrong about ivermectin, and why.
That you didn't says you haven't moved with the times, and although your general perception of the paradigm's workings is correct, the specifics have changed, and that is why you are still applying the old rules.
See Adam Hill's Zoom today for example
I can't figure out what "Adam Hill's Zoom" is supposed to refer to. Link (or, if it's a video or podcast, summary?)
Sorry, I meant Andrew Hill. The Zoom call with Tess Lawrie. I was watching the actual Zoom call just before reading this. I can't find the video atm, but there is a partial transcription included here: https://www.worldtribune.com/researcher-andrew-hills-conflict-a-40-million-gates-foundation-grant-vs-a-half-million-human-lives/
I regard conspiracy theorists a bit differently. This theory basically says that media is an interpretive process, effectively an act of mutual interpretation between broadcaster and receiver. The broadcaster is trying to convey what they want the other person to believe. So far we agree. But you posit that the receiver is trying to determine what is true and what is false in the broadcast, and that conspiracy theorists are people who are doing this badly.
I don't think that's true. I think the receiver is trying to determine what they should personally do. They're not actually invested in truth or the institution of news. (I suppose this makes me overly cynical since it means NEITHER side is invested in truth.) For example, take the vaccine stuff. The news is trying to broadcast the message the vaccine is safe, necessary, etc in an attempt to get the person to take the vaccine. The receiver isn't fundamentally trying to determine whether any of this is true. They are trying to decide whether they will take the vaccine. Whether they should socially pressure other people to. And so on. Part of that is undoubtedly determining whether the news is telling the truth. For example, if the news reports the vaccine makes you grow wings and no one's growing wings then that's pretty relevant. But only a part and it's certainly not a necessary condition.
Once a person makes a decision they construct an epistemology that justifies this decision. Or alternatively they already have an epistemology and it creates the belief. That's complex. Regardless, this is true for both broadcaster and receiver. Conspiracy theorists are people who construct epistemologies focused around conscious deception (a conspiracy). Like most epistemologies it's communal rather than individual. This creates a social-cultural network/pattern. Which of course the broadcasters and non-conspiracy theorists have too.
The conspiracy theorist's central unfalsifiable claim is both powerful and handicapping. Because it's unfalsifiable and often totalizing ("everything is Illuminati!"), it makes it difficult for them to effectively achieve their ends. Even when they win it often doesn't achieve what they want. On the other hand, this is an ideal way for the belief to spread and maintain itself. Someone with concrete goals ("get everyone vaccinated") must eventually come to their end. Someone with a vague, unachievable goal ("eliminate the Illuminati") gets to flexibly gloss over policy details and apply their lens to every situation. And they never have to deal with the goal being achieved. Victories and defeats occur, but never the ultimate victory or defeat. And this fight can pay pretty direct benefits to its members, sometimes even on a society-wide scale.
In summary: Conspiracy theorists are not failing at being mainstream. They're succeeding at being conspiracy theorists.
For the record, while your overall point is interesting, the choice of examples (Fox News, immigrant rapists, ivermectin) is sort of annoying and leaves an aftertaste. Those topics are sort of emblematic of an intellectual niche which is, to be blunt, AMPLY covered by other outlets.
No, you just don't like these things so you think he shouldn't have brought them up. But please, PLEASE, show me these other outlets where immigrant rape statistics being effectively censored has been amply covered? It REALLY REALLY sounds like you just don't want people knowing these statistics, especially given your track record on this blog (i.e. literally calling mainstream behavior heritability research "1920s eugenics")
"Show me these other outlets": Check your bookmarks list, I expect it's overflowing with them.
Much less of this, please. I am also interested in an answer to the question. I think there's a small chance, but nonetheless worth investigating, that you might not actually be able to find all that many, to your own surprise.
I'm writing this comment because there is a lack of a report button. If you're reading this, Scott, this is a report.
What other examples would you have used? I need to use something where the media is biased/lying and people are angry about it, that kind of by definition means culture warrior-y stuff that makes lots of people angry.
For the record, I personally think your examples are absolutely fine
I like ACX because you always do a good job of finding interesting examples that are off the beaten path, things I would never have seen or thought of.
I also think they are fine. Another case where you can get angry is when the media covers something you really know: sometimes because it's your area of expertise, sometimes because it's about you, someone close to you (or just simply someone you personally know), or your neighborhood. This last case is super enlightening because it's so directly brutal and gives you a lasting lesson about how much you should trust the media.
A frequent reaction is wanting to punch the journalist in the face. Not always; sometimes it's the other way around. But to really appreciate the reporter in the second case you need to have experienced the first kind, just to see how bad it could be :-).
Unfortunately, this kind of personal expertise does not lead to good examples, as people not involved have by definition very little knowledge apart from what is reported.
I think that your examples are fine within a certain context and framework. That said, when you say "What’s the flipped version of this scenario for the other political tribe?" there's an implication there that this is an essentially equivalent case, only flipped for tribal politics, but here we can see that the left lies while the right does not.
I didn't take you as intending to convey that impression in your essay. But, from the time I've spent on the SSC subreddit from before the culture wars content was split off into its own sub, I honestly do think that a non-hypothetical, and probably quite large portion of your reader base would interpret the essay in exactly that light. Either "Scott chose these examples because he wants to make the point that the liberal media lie more than the conservative media, because he's a conservative and naturally wants people to think that," or "Scott chose these examples because he's correctly pointing out that the liberal media are fundamentally more dishonest than conservative media."
I don't agree that people should think that Scott is saying the liberal media lies more or is worse than conservative media. He starts off the post with the assumption that Fox News is loose with the truth and that most people agree with that assumption. I think the main point he's making, one that has been reinforced over quite a few articles, is that almost ALL of the popular media outlets are fairly dishonest and how much of a problem this is.
P.S. This is my first post after being a long-time reader.
I don't think people should think he's saying that. But my impression is, this isn't just a theoretically plausible way people might read the essay, but a way that a significant portion of his audience does read his work in practice.
I was (and to an extent still am) a regular commenter on the SSC subreddit for years, and I've taken positions arguing from both the left and right on different subjects on numerous occasions, so I feel it's given me a sense for what the political skews among that portion of his audience base at least actually look like.
I'm torn on how much the audiences interpretation should matter in this case. As a long time reader I've always gotten the impression that Scott was someone who had left of center beliefs but who was more focused on finding the truth of issues and looking for common ground than someone who was hung up on toeing the party line. Given this framework, I think he is more concerned with liberals who view anyone who doesn't agree with the media-narrative-du-jour as a troglodyte than with the conservatives who will use this as an excuse to dunk on liberal media for being loose with the truth. Given that the vast majority of the media leans heavily left (I hope we can agree on this point), I think it makes his choice of examples justified.
I think this may be more Scott's focus, but in my time discussing politics on the SSC subreddit before that was split off into its own community, I spent about equal numbers of conversations arguing from the left and from the right on different positions, and in my experience, there was a really large and unsubtle difference in the pushback and vote scores I got depending on which side I was arguing. When I argued from the left, I would get *dramatically* more pushback, and lower vote scores, than when I argued from the right, despite the fact that I adapted to this by putting more effort into my arguments when I argued from the left.
So, I think that given that context of his audience base, being concerned for readers interpreting his writing through a lens of "of course, the left wing is obviously way more untrustworthy than the right wing" strikes me as fairly well warranted.
Your examples are fine; you have just succeeded in making a few people uncomfortable, which is exactly what should happen when you’re criticizing institutional patterns.
Viz., the guy who started this and thinks that it’s not respectable to cover the culture war (and then immediately lashes out at responses with thinly veiled ad hominems), or the guy who thinks your writing doesn’t show sufficient both-sides-ism.
The Russia Collusion story. Which you did refer to.
Most media is better interpreted as intended to entertain than inform.
Critically, this is also true of media that claims it is intended to inform. Finding something that actually *is* intended to inform is 90% of the battle.
I think it’s important to interpret these cable news channels as an appropriate mixture of the two. They very much are not doing what HBO or even NBC Must See TV or whatever is doing. They’re a lot closer to Us Weekly, which is giving you infotainment of a sort.
Well yeah. That's because most media consumers are far more interested in entertainment than in information acquisition per se. The purpose of most human conversation is to buttress pre-existing intuition and signal community belonging rather than to genuinely exchange data. Like most social species.
If only that were the only reason... But why would anybody inform when they could just as well, with the same effort, influence?
That's the scary part :-)
It's the most important reason. Follow the money, always a good first rule of evaluating human transactions. We all gotta eat. So if you want to know why Vendor X produces Product Y, ask yourself what his consumers want, and the answer is almost always Y, even if they *say* Z. (And if they *do* say Z, you'll also find a robust industry of people who are selling "This Z is really Y if you think about it" stickers.)
I don't assume people want influence for the sake of influence per se. The genes of Ghenghis Khan are relatively dilute by now. But "influence" = "sales" and "sales" = "my mortgage gets paid and maybe I can buy a new iPhone" and I think that's what dominates the actual thinking. We are all descended from umpty generations of humans who were very successful at convincing the tribe that when the food ran short *our* contributions were very necessary so let someone else starve. That instinct is deeply wired into us.
I would have agreed 100% with you pre-COVID. Now I only mostly agree. Using only "follow the money" I would have predicted the situation to be back to normal much faster, even if that cynically means slightly more people dying in the retirement homes. Nope. It seems that measures involving strong and widespread monetary losses were taken. Politics had much more power relative to the economy than I thought, even in the West... Or maybe I am not subtle enough at following the money.
I now think I have underestimated the non-monetary currents in the modern Western world. MeToo is another such thing. Where is the money flow there?
I think ideology did not die in the nineties. It took a nap, but it is back, in other forms.
I'm not disagreeing there are other factors at work, certainly. I did say it was the *first* rule, but not the only rule :) But let me also suggest that often it can be quite challenging to follow the money, so to speak. That is, the *way* in which this or that situation can be profitable for the people encouraging it can be fairly byzantine, hard to untangle. For example, *who* is losing money because of those measures? When you examine that question, it often seems to me there's a suspicious imbalance: it tends to be the people who are not part of the decision-maker's in-group. And let us also remember that just because people *on the whole* are becoming impoverished doesn't mean any one particular group is. There's such a thing as war profiteering and short-selling -- you can become quite rich in ways that exploit the descent of others into poverty.
I'm a professional media critic. My assumption from decades of close reading of the New York Times is that if I read a statement in the Times, it's very likely true. For example, if the New York Times tells me an Asian woman named Michelle Go was shoved to her death on the subway tracks by a man named Simon Martial, I'm sure that's true.
If the Times were to tell me Simon Martial is white, I'm sure they wouldn't be lying.
On the other hand, the Times finds some other facts are not fit to print. In particular, the Times does not like to go out of its way to raise doubts in the minds of its subscribers about their general picture of who are the Good Guys and who are the Bad Guys that they've developed over their years of relying on the Times for news.
Therefore, both Times articles I've read that mentioned that victim Michelle Go is Asian did not mention the race of perp Simon Martial.
Coulter's Law states that if the news media report on an outrageous crime but don't let you know the race of the perp, he's usually black and almost never white.
More specifically, the Times has heavily promoted the theory that violence against Asians is due to Trump saying the words "China virus" a couple of years ago. This is a popular idea among The Times' paying subscribers. An alternative hypothesis is that misbehavior by blacks (e.g., shootings and car crashes) is way up since the mostly peaceful protests of the racial reckoning.
But most subscribers do not want to hear evidence for that. To even entertain that idea would raise serious questions about who exactly are the good guys: Is the Times itself a bad guy for promoting a bad idea -- Black Lives Matterism -- that has gotten thousands of incremental blacks killed violently since 5/25/20? Most of the Times' millions of subscribers are quite content with their notions of who are the good guys and who are the bad guys (Trump and Trump supporters) that they've derived from reading the Times and might not renew their subscriptions if the Times itself were to print more facts challenging the worldview the Times has inculcated in them.
But it's even more complicated than that: many Times reporters are excellent and would prefer to report the full story. So what I've often noticed is a frequent compromise between the marketing needs of the Times not to trouble subscribers with unwelcome facts and the reporters' desires to publish interesting facts. Often, if you read NYT articles all the way to the end, you'll stumble in the later paragraphs upon subversive facts that, if you think carefully about their implications, undermine the impression the headline and opening paragraphs give. Of course, most subscribers have stopped reading by that point, so they never notice.
I appreciate your comment on the subversive facts hidden 3/4 of the way through an article. Do you think it’s that the editors stop reading at the 2/3 mark, so the writer knows whatever’s at the end will get through? Or is the editor letting it through, based on the “no one reads this far” approach, so they can safely give the writer what the writer wanted?
I think people who work for the New York Times are mostly really good at their jobs, so, yeah, I assume editors definitely read all the way to the end of articles they are editing.
I imagine that unwelcome headlines or topic statements could elicit emails from the Marketing department saying that focus groups make clear that this kind of thing is not pleasing to paying subscribers, or could elicit cancellation attempts from the Junior Volunteer Thought Police of low level workers/true believers.
Generally, when NYT reporters drop undermining facts into the second half of the article they don't spell out that they debunk the impression given by the first half of the article. I often wind up saying to myself when I get toward the end of an NYT article and finally read some key facts, "Oh ... so _that's_ what's going on! Now it all makes sense." But I doubt if many other people notice this pattern.
I definitely notice - I think lots of outlets do it. I used to assume it was due to cut-and-pasting from different wire services. Small papers do it too. I think the WSJ does it less often. The Atlantic starts dropping things in earlier but spends fewer inches on it.
It's not at all rare to find contrary evidence in a news story. How it typically happens is for the main purpose of the story to get stated in complete form, with supporting evidence, and then a small "Congressman Bob [from the opponent's party] said that it wasn't true," and then not offering much or any supporting evidence on that side.
We typically call that "spin" and it's definitely related to the overall topic, though not as severe. Spin has existed forever, but the outright lying (directly or through obvious omission) is either newer or more pronounced than it used to be.
There may be another very important reason to include that material, especially in a part of the article less likely to actually be read. By doing so, the Times can accurately claim that they presented evidence to the contrary and a more complete story. It's similar to printing a tiny retraction on page 10 to a false front page story. They can accurately state that they printed a retraction, even if a much smaller audience actually read it.
I'd consider CYA inserts in NYT articles to be a different class of things: e.g., "A spokesperson for the Tobacco Lobbyists Association denied everything."
I'm thinking more of where you get told something in the 14th paragraph that causes the scales to drop from your eyes: e.g., you find out the female Linux expert whose hobby is memorizing which ways putts break on every green in the World's Top 100 Golf Courses, which proves that women are just as good at 3-d cognitive visualization, used to be a man.
If they don't mention the inconvenient facts at all, they leave themselves wide open to accusations of being biased.
But if they can retort "but we did mention that, look, it's right here in the article, you just didn't read it!" you have to get into a much muddier discussion about how misleading the headline + opening paragraph are when the facts are mentioned later in the article, and whether it is or isn't reasonable to expect all readers to read the article all the way through.
I’ve seen similar fact switches in the Economist too. The article will start with a very pro market, soft economically right view that won’t challenge any executive reading it, and is basically true. Then the last few paragraphs will show the complexity and nuance, and indicate the need of a regulation or other more left wing intervention to create the best outcome. In this case I quite like it, but then I like the Economist because it likes nuance.
When I subscribed to The Economist in the early 1980s, I was wowed by those big 20-page-long super-articles in the middle of the magazine on one general topic. But the short articles on the US gave the impression of having been worked up by clever young Oxbridge grads who, despite a better way with words than their American counterparts, didn't really know much about America.
Agreed, the Economist is my favorite news source for very similar reasons. The number of times I have raged at a headline and then been feeling more charitable by the end of the article, as they add the "well but alsos" is very funny. I also like that they are a bit more open about their biases - there are more naked values and judgment statements in their writing than in most general world news sources. It makes it easier to spot where the just-the-facts part ends, and where the Economist-editorial-position begins.
I used to like the Economist and read it very regularly for 5 or 6 years, until I experienced something like the Gell-Mann amnesia effect. On things I knew really well I started realising that their reporting was consistently wrong or ill-informed, and that made me realise I should probably value the rest of the magazine a lot less.
That is most definitely true. I'm reminded of the "pyramidal style" that we were told ages ago is the right way to write a newspaper story. Only in these strange days, the combination of rabid top-line tribalism *but also* a fact-checking ability, afforded to the generic news consumer, that dwarfs what was available in any other age and severely inhibits outrageous falsehood, means a new style has been appearing for some time now (which you already described), in which the glutinous starchy base of the article, further down, can almost contradict the sweet sugary apex at the top, which is what the tribalist subscriber base can be assumed to bite off to chew.
It's definitely a little weird. You end up getting to the end of the story and thinking "whoa! did the same guy write these last 6 paragraphs as wrote the first? No way!"
I wonder what it's like to *be* that person, though. Have they made their peace with it, to pay the mortgage? Is it enough that some minority of well-informed people read all the way to the bottom, and they know that?
I think a lot of them are thinking about the clicks, that is they write the last 6 paragraphs, and then decide what angle they are going to take to get clicks on the article.
I suspect they are operating in a world where "everybody knows" that the reality is in the last half of the article and the headline, lede and first paragraph are really just advertising for the article and - because they are advertising - can say anything that isn't literally untrue.
Sure, they *have* to be thinking about the clicks. The Internet has done its usual job of savage disintermediation, and you can no longer live an ideal journalist's life swaddled in the bowels of some enormous corporation that earns big bucks from the classified section and therefore can indefinitely indulge your wish to spend a working life reporting on true things that most people find dull or mildly offensive. The field is contracting, and everyone's got to be his own brand now, an entrepreneur, selling the product first and delivering news second.
I came across a series of Youtube videos recently that were an interesting illustration of the problem. They're by a young (by my standards so early 30s) American woman who *looks* clearly American -- tall, blonde, round-eyed -- but who speaks very good Chinese and Japanese from having studied and lived in the Far East for almost 10 years. She made a bunch of videos in which Chinese or Japanese are startled when the American blondie can understand and speak to them in their native language -- great fun, and they attracted a huge number of clicks. Which led her to think she could make a living making "an American in Japan/China" videos, but then she found out that when she made videos delving into the nuances of cultural adaption her audience was like meh -- hey, do more of that thing where the waiter gasps when you order chow-fun noodles in perfect Mandarin! Those are a hoot! She's smart enough to realize she needs to market herself first, because the bills have to be paid, but she also wants to not be stuck in the functional equivalent of funny cat videos shtick, and she's wondering how to square that circle.
I'm always a little bemused that there are so many people who are shocked and surprised that this state of affairs exists, though. ("Journalists today! My God, all they think about is pandering to their audience!" "Scientists! All they think about is appealing to granting agencies and the peers who will review their next grant application!" "CEOs and other corner-office cowards! All they think about is how to appeal to this or that customer demographic of which there may be millions but which I personally find regrettable!") As if any of these things is weird and unnatural, instead of how humans have operated since Ramses II assumed high office.
I tend to attribute it to the fact that so many modern people have spent big chunks of their lives embedded in vast organizations, like the Times reporter in the Times's heyday -- in school, or working for enormous corporations. Sort of a modern feudalism. Having less experience of what it's like to *be* an entrepreneur, or work for a small business, or work in sales, where Always Be Closing is Job #1 and you'd better never ever forget that, lest you have to borrow from your mom to pay this month's electric bill, they seem strangely unaware of this gritty reality experienced by legions of their fellow citizens. (And contrariwise, the people who are living the life of constant personal brand-building find the viewpoint of those embedded in giant orgs just as bafflingly alien.)
> This is a popular idea among The Times' paying subscribers
Do you know this? Paying subscribers often are at odds with what's going on in the article. The comments seem to be calling out the woke stuff more and more, like the recent one about youth transition (see https://www.blockedandreported.org/p/premium-is-the-conversation-on-youth).
Supposedly, reporters used to start articles with the five W's pertaining to the subject matter - i.e., Who, What, When, Where, Why. The NYT's current style, however, is to lead with just one W - What you are supposed to conclude. I especially like how they tell you this by cramming in faux context with "amid . . .," and faux causation with "following . . ." And then, in case you are really dense, an "experts say . . .," to hit you over the head with the message.
So the article will read something like: "An Asian person was assaulted yesterday. The assault occurred amid a rising tide of media reports of anti-Asian hate crimes following Donald Trump's use of the racist, xenophobic phrase 'China virus.' Experts say that such hateful comments trigger anti-Asian racial hostility and violence by white supremacists . . . . ." Blah, blah, blah for 44 paragraphs, then at Paragraph 45: "According to police reports the perpetrator was a homeless man named Deshawn Abdullah Jackson who has mental health issues and a history of making assaults in the area."
> The world ended this morning. Here's why that's a bad thing.
I could never have articulated it myself, but your example is a perfect copy of a lot of writing I've consumed over the past few years.
You're also assuming a false dichotomy here. It isn't inherently incongruent to be anti-police, think the BLM protests were good, dislike "wokeism", and recognize complex unintended consequences and other biases. If you have an "either/or" prism for looking at these things, that affects the way you will read bias.
I would like Lincoln more if he were friends with Marx. It would show he considered different opinions to his own and was humble enough to discuss ideas he disagreed with.
https://nakedemperor.substack.com/
McGoohan was a really interesting guy!
I disagree strongly with the characterization of the Swedish study. The study really did focus on immigration status as the most prominent result of the analysis.
In particular, Scott claims that, according to the linked article, immigration status was not "a particular focus of their study" and that "although it wasn't a headline in their results, you could use their study to determine that immigrants were responsible for a disproportionately high amount of rape in Sweden."
I went and looked up the original article and skimmed it. Here is the first paragraph of their results section:
"Results
Descriptive data
Between the years 2000 and 2015, a total of 3 039 offenders were convicted of rape+ against a woman (Table 1). The majority of the offenders were men (n = 3 029; 99.7%) and the mean year of birth was 1976 (SD 12.3). Close to half of the offenders were born outside of Sweden (n = 1 451; 47.7%) followed by Swedish born offenders with Swedish born parents (n = 1 239; 40.8%). A relatively small part of the cohort was constituted of offenders being born in Sweden with at least one parent being born outside Sweden (n = 349; 11.5%). Table 2 shows from which regions the first- and second-generation immigrants and their parents originate from. Among Swedish born offenders with one parent born outside of Sweden (n = 172), the foreign-born parent was mostly born in Western Countries (72.7%) followed by Eastern Europe (11.0%). Regarding Swedish born offenders with no parent born in Sweden (n = 177), a high proportion of the mothers and fathers were born in Western countries (40.7% and 33.9%) followed by the Middle East/North Africa (19.8% and 24.0%). The largest group of the study population was found among offenders born outside of Sweden (n = 1 451); a significant part was from the Middle East/North Africa (34.5%) followed by Africa (19.1%)."
I think this is the definition of making something a headline of one's results. One of the most prominent pieces of information in the results is the breakdown of cases by immigration status. It specifically says that more offenders were born outside of Sweden than born in Sweden to Swedish parents.
It looks like this mischaracterization was not present in the news article that Scott linked, which discusses this research paper. In that news article, they specify (with quotes from the authors) that the original purpose of the research was not to focus on immigration status, but that it was something they discovered by chance while doing the research. In particular, the claim that immigration status wasn't a headline of their results seems to have been introduced by Scott.
I don't know whether Scott had access to the original research paper - I couldn't find a freely available copy of it. However, this same highlighting that I quoted above is also present in the abstract of the paper, which is freely available. Here's the relevant content from the abstract:
"A total of 3 039 offenders were included in the analysis. A majority of them were immigrants (n = 1 800; 59.3%) of which a majority (n = 1 451; 47.8%) were born outside of Sweden."
The abstract is freely available here: https://lup.lub.lu.se/search/publication/e2c65632-50e1-4741-a1b3-21664eaf7724
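For what it's worth, the headline proportions follow directly from the raw counts quoted above. A minimal sketch (counts taken from the quoted abstract and results section; note that 1 451/3 039 computes to 47.7%, matching the results section, while the abstract rounds it to 47.8%):

```python
# Sanity-check the study's reported proportions from the quoted counts:
# 3,039 total offenders; 1,800 first- or second-generation immigrants;
# 1,451 offenders born outside of Sweden.
total = 3039
immigrants = 1800
born_abroad = 1451

print(f"immigrants:  {immigrants / total:.1%}")   # abstract reports 59.3%
print(f"born abroad: {born_abroad / total:.1%}")  # results section reports 47.7%
```

Either way, the quoted numbers do show that immigrants were a majority of convicted offenders, so the disproportion is readable straight off the descriptive statistics.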
I don't disagree with Scott's overall point, which is that the researchers face repercussions for their findings, repercussions that they likely would not have faced had they found the inverse conclusions. But I strongly disagree with the implication that one would have to go out of one's way to use their study to determine that immigrants were convicted of rape at a disproportionate rate. It makes it sound like the scientific establishment is raking through papers only tangentially related to this topic to find people to crush, and that's just not what happened.
In a way, finding this small but important inaccuracy in this essay drives home the overall point of this essay. It's necessary to distrust every source, to the extent that they're willing to stretch things or not double check things or generally be unreliable. And that applies to this essay as well.
Nope, they investigated a lot of things, and it turns out one of the largest factors in rape offending was immigration status, literally a majority of offenders were immigrants, so it gets reported on first. What's the alternative, list everything in alphabetical order? The convention used here is extremely common in scientific papers. Scott is right, it wasn't a focus of the study. The study wasn't "Do immigrants commit more rape?".
Look at this abstract (I excluded the last sentence discussing the results). If you didn't know the results of the study, would you call this a study focusing on immigrants? Of course not!
Abstract
Sweden has witnessed an increase in the rates of sexual crimes including rape. Knowledge of who the offenders of these crimes are is therefore of importance for prevention. We aimed to study characteristics of individuals convicted of rape, aggravated rape, attempted rape or attempted aggravated rape (abbreviated rape+), against a woman ≥18 years of age, in Sweden. By using information from the Swedish Crime Register, offenders between 15 and 60 years old convicted of rape+ between 2000 and 2015 were included. Information on substance use disorders, previous criminality and psychiatric disorders were retrieved from Swedish population-based registers, and Latent Class Analysis (LCA) was used to identify classes of rape+ offenders. A total of 3 039 offenders were included in the analysis. A majority of them were immigrants (n = 1 800; 59.3%) of which a majority (n = 1 451; 47.8%) were born outside of Sweden. The LCA identified two classes: Class A — Low Offending Class (LOC), and Class B — High Offending Class (HOC). While offenders in the LOC had low rates of previous criminality, psychiatric disorders and substance use disorders, those included in the HOC, had high rates of previous criminality, psychiatric disorders and substance use disorders. While HOC may be composed by more “traditional” criminals probably known by the police, the LOC may represent individuals not previously known by the police.
One does not write the abstract before doing the research.
Just because something is a major finding and therefore a focus of the abstract does not mean it was an intentional focus of the research.
Umm ... yes they do. I ran into a fellow at a research station, who was studying some bug in the desert to prove climate change. He had his paper existing in his head before he started his data collection. Just look at the number of students who have a degree in global warming.
Obviously you know roughly what you are going to do before you conduct the study, but you don't write the abstract. You do that when you are actually writing the manuscript after the study is complete. For the last paper I wrote, the abstract was _literally_ the last thing we wrote before submission.
Source: published scientist with lots of other published scientist friends.
The paper "existing in his head", is not the same thing as the paper, or the abstract, being *written*. Yeah, the guy probably had a vague concept of how the paper was going to flow and what the conclusions were likely to be, but I doubt he had more than three words strung together in his head.
And if he did, memory is *extremely* mutable. By the time he'd finished analyzing the data, he'd have an abstract "existing in his head" that's a good match for the data, vaguely similar to the abstract that existed in his head at the start, and he'd believe the two were nigh-identical.
Dangerously Unstable is right. Almost nobody writes the abstract before they've completed the research. And if the abstract isn't literally the last thing written, it's because the submission deadline for the abstract comes well before the submission deadline for the paper - it takes effort to write something brief and accurate, so writing the paper serves as a rehearsal for writing the abstract.
Also a published scientist with lots of other published scientist colleagues (and just reviewed one of their abstracts this morning, written first because of submission deadline).
As someone who is a journalist and a fairly close follower of Swedish debates on crime and immigration (I lived there for seven years, for three of them as a working class immigrant in what is now a ghetto but then wasn't), I think Scott is 90% right, but missing one important journalistic skill, which is that we know which experts to trust, and how much to calibrate in each case what you might call the Pravda factor.
You have to remember that no expert or insider will tell the whole unvarnished truth in public except in very rare cases. This is normal and natural. Either they will be misunderstood, usually deliberately and often by their own side, or they will be ignored.
But if you're lucky, and have something to trade, and if they have learned that they can trust you, they will talk much more honestly in private. Given that the Swedish debate about immigration and crime is so inflamed, and the public story so very different from the things people assume in real life, the first thing I'd do is ring up a criminologist friend and ask if this story is bullshit. That would be off the record and it would have to be. Unless they felt there was a huge injustice going on, taking sides publicly would be as pointless as joining in a Twitter spat.
What they told me would feed into what I then wrote. But now we're up into double layers of trust. The reader has to trust that I have a trustworthy source. Why should they? Readers don't on the whole interact with individual bylines enough to establish a relationship of mutual trust. So Scott's original heuristic is about right.
But it does lead to a genuinely damaging situation in which (to speak from experience) a Guardian executive will say "We can't use that quote because the Mail would love it". And, presumably, vice versa.
"one important journalistic skill, which is that we know which experts to trust, and how much to calibrate in each case what you might call the Pravda factor."
Do you though? I doubt this assertion and think that it is closer to confirmation bias than a skill.
I mean Tetlock has found the experts that do go on the news to be worse than those who do not.
If you understand Norwegian, this show talks to a bunch of journalists and draws out the point that they choose the ability to easily convey their meaning over knowledge of the subject matter:
https://tv.nrk.no/serie/folkeopplysningen
In other words, journalists optimize for a good interview, not an accurate interview. In addition, they like the ones that are known and will use people who they know say yes and are reliable over finding an actual expert in the field.
I can't have made myself clear; I'm sorry. A public interview is never just, or mostly an exchange of information. It's a performance for an audience. It is always choreographed and usually edited. But the conversations which inform us most are those which are held just with the source, with no readers or listeners overhearing. And it is in the nature of honest speech between two knowledgeable people that often as much is conveyed by what's not said as what is. That aspect is obviously impossible to convey in public.
(I do as it happens understand Norwegian reasonably well, though Danish is impenetrable to me.)
If one is interviewing scientists or academics, *of course* we optimise for the effect on the audience. That is because ultimately the audience or the readership are our paymasters. It is no use to anyone if you interview someone who cannot make the truth comprehensible to the audience. Translating the natural speech of an expert into the natural speech of an ignoramus is the core arbitrage performed by any specialist journalist. This is much easier in print than on radio and hardest of all on live television. So, yes, there is a natural bias towards fluency at the expense of thoughtful understanding. The point I was trying to make is that good journalists are aware of this, and make allowances.
To give a concrete example: I have interviewed both W.D. Hamilton and Richard Dawkins. There is no question which was the greater scientist, but if you're looking for someone quotable it's Dawkins every time.
Some lines the media already crosses are pretty far out there. No idea about FOX, but I do know that France 24 or TV5 Monde can quote a foreign figure saying X and "translate" that as saying the opposite of X. Or show footage of NATO tanks and imply those are from a non-NATO country. Source: a relative who watches France 24 and TV5 Monde and knows both languages.
Yes, unfortunately my prior on "formerly respected news outlet will simply make up basic facts" has drastically increased in the past few years. I've caught the BBC flatly lying about basic facts in the form Scott claims FOX would never do, several times now. A few examples:
1. Some friends sent me a TV segment about COVID as a way to "prove" I was wrong about something. The segment had an interview with a woman who was introduced as a "Dental specialist", the idea being that the NHS was so overwhelmed it was having to recruit people from other medical fields to serve on COVID wards. The woman didn't sound anywhere near confident enough to be an expert on anything, but fortunately had a rather unique name, so I quickly Googled her. She's actually a social worker who tries to get prostitutes and drug addicts to go to the dentist.
So - the BBC will lie about facts like what jobs people have.
2. They wrote an article about a vote in Switzerland, again related to COVID measures. It presented an extremely amateurish hand-drawn cartoon image and claimed this was "an ad by the Swiss yes campaign". But there wasn't any yes campaign in this case, so I reverse image searched the picture and found it came from an article about one man who rented a couple of billboards for a couple of hours to troll some protesters, because he was upset there wasn't a yes campaign. The image looked amateur and hand drawn because it was.
The same article had a graph of COVID cases with weird drops and spikes in it for Switzerland, but no other country. The caption of the image claimed this was due to delays and data errors by the Swiss government. If you're thinking that doesn't sound very Swiss, you're right. I checked and the data errors were introduced by the BBC, the Swiss COVID dashboard didn't have them.
So - the BBC will lie about things like the existence of entire political campaigns, and even graphs of government statistics cannot be trusted.
3. The BBC likes doing vox pops with people introduced as "nurse", "doctor", "teacher", "professor of X" etc. At some point it was noticed that these people would attack the government and nasty Tory party far more often than you'd expect given the voting habits of the general population. A website called Guido Fawkes started checking the background of these people and discovered that staggeringly often, they were Labour Party activists and this wasn't disclosed anywhere. At one point a COVID related Panorama special was broadcast in which every single "expert" turned out to have engaged in public left wing activism, and some were actually attending/speaking at Labour Party rallies:
https://order-order.com/2020/04/28/panoramas-ppe-investigation-party-political-broadcast/
So - the BBC is willing to present people as neutral experts when they're actually party political activists, and not tell anyone that.
There are many other examples that could be listed here but unfortunately I've now learned that actually, TV journalists WILL lie about basic facts like numbers, dates, job titles, events in foreign countries. The issue is not one of mere bias or selective presentation of facts, but that even the most basic claims about the most objective things cannot be taken at face value.
From the other side, some viewers of the BBC's Question Time spotted that a particularly distinctive member of the audience kept appearing in different instances of the programme held in different places. It turned out that the programme managers were encouraging people from more right-wing groups to attend the broadcasts in order to counter what they perceived as a liberal bias among the people who applied for places in the audience, and to create more on-screen argument.
Neither practice is admirable, but I mention this one to counter the notion that the bias is all in one political direction.
That, or the BBC has a very strong idea of who they want to be their "controlled opposition".
You want a combination of people who either say "Well I usually support the Tories but I oppose this thing the Tories are doing" or else people who are complete and obvious nutters and will say something stupid.
I think you're reading too much into this. The BBC comes under significant criticism from the left as well as from the right. The current director-general of the BBC, Tim Davie, used to be a conservative councillor and was deputy chair of his local constituency party. It's a large organisation with many groups and sub-groups, and it also funds some of its content from external organisations. It's highly unlikely that there is a controlled and deliberate disinformation campaign across the whole organisation.
This would be a lot more convincing as an argument if the media didn't frequently make up events that never happened or lie about them in hugely significant ways. You've covered many such events in the past on your blog as well, which makes me wonder if those all somehow fall into lies no smart person was expected to believe, even if those lies launched wars where millions of people died.
Recently Rolling Stone Magazine made up a story about ivermectin poisoning cases causing gunshot victims to be unable to get into hospitals. This was picked up and repeated widely in the left media despite having no basis in reality.
Not to mention the clear lie about "horse dewormer", which is utter nonsense concerning a drug on the WHO essential medicines list, whose discoverer won a Nobel Prize, and which is an approved human drug in every single developed country in the world. Are you telling me they didn't know they were lying as they cashed those Pfizer advertising checks and listened to their boards of directors, some of whom also sit on Pfizer's board? Is this outright and intentional lie equivalent to making up false footage of a mass shooting? Is this not the sort of totally made-up nonsense reporting you're talking about them not doing? Because they are doing it any time they feel like it.
And what of the evidence-free Russiagate story, while they ignore what Google and Facebook have done to interfere in the election?
What about something simple like how Rodney King's name isn't Rodney King? They couldn't get his name right in the reporting and stuck with their error.
These are higher-profile cases and lend themselves toward controversy, but even with various other stories, reality gets twisted to such a degree that it may as well be made up. Anyone I've met who has been part of a news story has said that the reporting was a lie and a misrepresentation of what went on.
Science reporting is a favourite site of malfeasance, making up nonsense to pretend a study says the opposite of what it actually says, often with the scientist telling the reporter over and over again that they're wrong. There is no world in which those science journalists don't know the truth and limits of the study when they've spoken to the corresponding author, yet they'll just make up lies as they see fit, and they do it on purpose.
Is this 100% exactly the same thing as making up a false citation to then say whatever they wanted to say? No, it isn't, but I fail to see a difference. If you can say whatever you want, then the base reality event is just random cannon fodder for the lie machine, even if you can squint at the gruesome chunks of flesh and occasionally make a guess at reality. If you want to say up is down, it isn't hard to find some loosely related "up type" event in reality you can twist.
Your own experience with the NYT is proof enough of a low-stakes case where pigheaded reporters and editors simply wish for whatever reality they want and do whatever they want with information.
Does it matter if half the media lie and the others do not? Is it somehow better if both FOX and MSNBC go along with Bush and Powell about nonsense yellowcake and connections to 9/11?
Was a reporter not caught on a hot mic talking about how she had the Epstein story years ahead of time and had to suppress it because of management who didn't want to get cut out of the royal wedding and baby coverage? Or how they all lie about his "suicide" that was clearly not a suicide? We just nod along and go yes, yes... we smart folk know he was obviously some intelligence agent taken out by his handlers.
In any given year many false, made-up, and non-factual stories run, and there'll be a mix of them doing these things on purpose, simply picking up propaganda and running with it, and the mightiest tool of censorship being non-coverage of stories to devalue them. Along with early reports, rushing, and just plain being wrong.
But they definitely make up things out of thin air when base reality fails to provide them an excuse they can use to say something else entirely.
"Not to mention the clear lie about horse dewormer which is utter nonsense..."
Well, it is a horse dewormer. It's not JUST a horse dewormer, but it is a horse dewormer. This is exactly what Scott means by being technically right and deceptive at the same time. CNN didn't say, for example, that ivermectin was used to kill people in gas chambers, which would be an outright lie.
Let's push the line a little more: what about the reporting that Joe Rogan and others who were *prescribed* Ivermectin took horse dewormer. How far into "deceptive" can we go before we can call it a lie?
Your example lie is clear, but it's very far from the line. I think there are legitimate examples like this which cross the line into lies.
I take the “horse dewormer” stuff to be the same sort of statement as “our competitor fills their produce with artificial chemicals while our stuff is all natural”, where it’s clear that what they are saying is likely 100% technically true but completely beside the point and just designed to make the other guy look bad.
I don't think it's the same, and the difference is because "artificial chemicals" and "natural" aren't strictly defined terms, they are vaguely defined categories that the reader is supposed to interpret.
By contrast, "horse dewormer" is literally a class of product you can buy, therefore claiming someone is ingesting horse dewormer is clearly asserting that they are ingesting medication *intended* for horses, and not one for human consumption.
I think all of these are pretty straightforwardly defined terms, and the issue is people using them for the wide penumbra of "technically correct" uses rather than the paradigmatic uses the term is intended for.
Agree to disagree then, because I don't think those terms are straightforward to define in a technically correct sense.
I also disagree that the statement "Joe Rogan took horse dewormer" is technically correct. "Ivermectin" is neither denotationally nor connotationally equal to "horse dewormer", so in what exact sense can you claim that that statement is equal to the actually true statement "Joe Rogan took ivermectin"?
It's definitely a strawman a lot of the time, but it's also born out of the fact that in rural states like mine, there are folks going down to the feed and seed and grabbing literal ivermectin packaged as horse dewormer. I think there's a difference between believing ivermectin can help and searching out a doc who will prescribe a regimen of the formulation designed for people, versus going out and grabbing veterinary medicine. There was a hilarious thread in a local FB group the other day, where some dude and a few of his adherents were loudly claiming something like: "Vaccines are stupid when you can go get ivermectin at the feed store and most I know are cured in 24 hrs"
To me the usefulness of "horse dewormer" is that as soon as someone says it, they are clearly outing themselves as having a particular political bias which is then helpful to put everything else they say into context.
If one is a scientist or doctor who has concerns about the use of ivermectin for Covid, then there should be no need to resort to "horse dewormer" to make a case. If "horse dewormer" is the best someone's got, then they don't actually have a case to make.
I have no position on the use of ivermectin pro or con, but I really don't like being emotionally manipulated.
I see what you are doing here. You are claiming it is not a horse dewormer to give an example of the kind of lie the media often tells. Genius!!! \s
I wasn't able to Google "Rodney King’s name isn’t Rodney King", what do you mean?
I think OP is also indulging in a bit of fact-twisting for effect here, intentional or not. Rodney Glen King was known as Glen by his friends, that's all. Not a particularly remarkable 'error', if it even rises to that standard.
https://www.latimes.com/opinion/story/2021-03-03/rodney-king-beating-30-anniversary
This is an interesting blog post. It also has theoretical potential, when it comes to elaborate and fine-tune a theory of human interaction as such.
You are essentially at the intersection between signalling theory and semiotics. Which in my humble opinion is “where the action is” in the human sciences today (and probably in the life sciences more generally).
What you are describing is the way principals (defined as actors in a coarser information position, in this case: those who read and watch the media) try to screen messages and signals from agents (defined as actors in a finer information position, in this case: journalists and editors belonging to different news outlets) to detect which agents are trustworthy/who to trust.
... Journalists & editors send messages and signals in order to come across as trustworthy. Users of media (the rest of us) try to screen these signals and messages in order to determine who we can trust, and who not to; including when we can trust messages sent by those we normally do not trust, and when to be sceptical toward messages by those we normally trust.
One of the (interesting) points in your blog post is that some principals are better at screening such messages and signals than others, including that those who are less good may (rationally) adopt cruder strategies in lieu of fine-tuned screening abilities, such as “trust nothing from news source X”, or “trust nothing except from your close circle of friends and relatives”. And then you try to suggest some kind of demarcation criterion to use, to improve your screening skills. Again, theoretically interesting – and of applied interest as well!
…you might also have the embryo here, of a strategy of how one might potentially establish what a commenter to a previous blog post labelled “the inner party”; i.e. how to solve the very difficult problem of creating a circle of people who are able to subtly signal what to believe and what not to believe to each other, while at the same time being able to collectively maintain signalling something different to the “masses” (the great unwashed). Re: your story about good versus glorious harvests in good-old USSR-time Pravda. Hmmm…
…Some classic essays and articles come to mind here: “Trust in signs” by Bacharach and Gambetta; “Trust as a commodity” by Dasgupta; “Strategic interaction” by Goffman.
Elaborating this type of insight into fine-tuning a general theory of signalling & semiotics, I suggest the following one-liner as the overarching premise for this general theory: “We are all principals when observing others, and we are all agents in the eyes of others”. Meaning that “we are all in a coarser information position when observing others, and we are all in a finer information position when being observed by others….”
Good in general but totally fucked about the 2020 election because the WaPo and similar sources were not in a position to KNOW whether there had been well-covered-up fraud AND THEY DIDN’T WANT TO KNOW.
No need to get into the details here to try to persuade you about fraud in the 2020 election, just telling you that it is a really bad illustrative example. Furthermore this IS the kind of thing they would lie about for the same reasons they spiked the Hunter Biden stories they knew were probably true.
On the other hand, the 2020 presidential election seems to have been absolutely fair in the sense that everybody had the same opportunity to cheat...
Seen from across the pond, US election integrity measures display a desperate need to be brought out of the 18th century.
I do NOT want to get into this now, but know that I have been working professionally as a consultant in the field of elections since 2002, and I have seen several elections stolen in precisely the way this one appears it may have been stolen (with some additional twists related to it needing to be done in 5 or 6 states at the same time, and a whole lot of high-level maneuvering related to blocking scrutiny). Although I won’t go into detail, it IS certain that illegal destruction of evidence that would allow a definitive answer to the question has occurred (which is not to say whether enough evidence remains to eventually arrive at a definitive answer). You are (1) correct about the insecurity of the system; (2) perhaps less than fully aware that the insecurity is built in on purpose, because it functions, like gerrymandering, more as a bipartisan incumbent-protection scheme than for partisan advantage; (3) wrong in this case about the parties having equal opportunity: this required a kind of coordination that was only feasible because of extremely good media control (along with the existence of key Republicans in 2 states and 1 news network who preferred that their party lose rather than that Trump win).
Where can we learn more about your take on what happened in the 2020 election?
> and I have seen several elections stolen in precisely the way this one appears that it may have been stolen
What way would that be?
Note: this only works when carried out by insiders in cities where one party is dominant so everyone around in any official capacity is sympathetic:
1) change rules to have mail-in ballots for as many people as possible
2) create fake voter registrations and addresses so ballots will be mailed there and can be filled out and sent back
3) on election night refuse to scrutinize signatures, pass everything through, while hindering observers from the other party and locking them out of some rooms where you are counting
4) if late on election night it appears that you didn’t steal enough votes this way, “find” more by introducing stacks of identically marked ballots with no chain-of-custody documentation to run through the machines
5) cover your tracks by combining groups of ballots that are supposed to be kept separate and other “oopsies” no one is prosecuted for, deleting logs and other data from machines, etc.
> create fake voter registrations and addresses so ballots will be mailed there and can be filled out and sent back
Okay, those lists are public. I get that local Joe might not have the money or capacity to do that, but professional GOP poll-watchers have no shortage of demographic information to compare against to find fake names and addresses.
Did they find those?
Of course they did. They did not have the resources to look at the entire state databases, but the studies that did a random sampling of them found that a few percent could not be verified: the issue, of course, is that there is no way of telling which candidate those unverifiable registrations actually “voted” for.
The irony of this comment being made on a post about people who fail to figure out when they're being played for a sucker and so end up stanning conspiracy theories...
I am trying to figure out a way to ask this without sounding insulting, but - this was on purpose, a troll, right; you can't really have this little self awareness, can you?
I didn’t say it has been proven that that election was stolen. I was *careful to say* that it had not been proven that it hadn’t been stolen, and that there was no justification for Scott’s talking as if it had been proven that it hadn’t. Furthermore, as a professional in the field, I was very familiar with all the evidential issues on both sides, and was only saying don’t jump to conclusions, because there has been a lot that has been covered up and not yet fully examined. I’m very suspicious based on professional experience, but I don’t want to hijack the discussion on this unrelated issue, which can’t be conclusively settled right now anyway, so I just criticized Scott’s casual assumption that the matter has been settled (and also that the boundary of what mainstream media would fail to report properly excluded this).
What's your perspective on the $300 million Zuckerberg donation? Was it a good-faith effort to help counties genuinely in need during the election? Or was it a play to help 'count the vote' from a known partisan funneled almost exclusively into known swing districts with the aim of influencing the outcome?
Both! For public consumption the former, but because of the actual identities and agendas and lack of principled neutrality of the specific individuals who were tasked with spending this money, it worked strongly in one political direction.
I'm not close to this, so maybe I'm wrong, but it's strange to me that one of the richest people on the planet giving hundreds of millions of dollars to help 'count the vote' right before the election isn't one of the biggest news stories ever. Especially when that billionaire has known political biases.
Not to mention the cast of billionaires in the richest companies in the world all simultaneously deciding (without collusion?) what election-related discussions are allowed across the most popular platforms on the internet.
This reminds me of banana republics that have 'free and fair elections', except that the challenger has to do his campaigning from prison and the incumbent must be trusted to count the votes and report accurately. The forms are all there, but something seems a bit off.
And the biggest red flag is that I'm not allowed to talk about it or I get banned from everywhere. Claims of a stolen election are a time-honored tradition in American politics, going back centuries up through the 2016 election. I didn't question the general validity of this election until I was told I'm not allowed to talk about it. That's when I started to wonder whether something might be going on here.
Yeah, I'm not an "election truther" or anything of the sort, but the implication of the section seemed to be when WaPo tells you there's "no evidence" (!) of election fraud, that should be as convincing to you as when Fox News tells you there's been a shooting. The latter is a positive statement about a simple reality, while the former is a negative statement about a much more complicated system.
I don't know that it hurt the overall article's effectiveness, but I think that example was a swing-and-a-miss.
The "no evidence" in the 2020 election fraud claims is in the category of "no evidence even though everyone was looking really really hard." Republicans found things that they thought were evidence of fraud. They took that stuff to court, often in front of Republican-appointed judges or in states where the Secretary of State was a Republican. The judges, almost unanimously, said "no, this doesn't look like fraud." To me, that's sufficient evidence for a newspaper to report "no evidence of large-scale fraud was found in the election."
You can't just ask yourself "do I think this source has a motive to lie?" and stop there - with enough motivated reasoning, you can always think of a motive for lying on any subject. You have to ask "do I think this source has a motive to lie, to lie about the existence of whatever evidence they cite to support the lie, and accept the risk of being confronted by adversaries who want to catch them in a lie?"
If someone believes the moon landing was faked, you should ask why the Soviets (who can easily track NASA's launches) didn't reveal the fraud. If you believe that there was election fraud in 2020, you should ask why the party in power was unable to get any of their claims to stick in court. You should ask why all of the people claiming "obvious, incontrovertible fraud" when talking to reporters suddenly retracted down to "well, you can't prove they *didn't* sneak extra ballots in there" when the time came to actually make legal claims.
When I see them say "no evidence" I trust them less because it's overstating the case and trying to snow me.
There are a smattering of small things, which if all taken together in one place wouldn't be enough to flip even one state.
So that's "minimal evidence."
. . . And looking back at the headline, they say there wasn't "significant fraud." The phrase "no evidence" didn't appear at all. God damn it, I wasted my time arguing against something that didn't exist.
Sorry, I didn't mean that to be a quote from the article. But that said, I think this is splitting hairs. There was "minimal evidence" the way that there is "minimal evidence" that homeopathy works, or that psychic powers exist. That is to say, there was evidence that could be convincing at a glance (especially if you were motivated to believe it), but when looking more closely turned out not to be anything meaningful. There were things like "a whole bunch of D ballots all got counted at once in some states" which turned out to be an artifact of the counting process, for instance.
There were also a few people who got caught trying to vote twice, which happens every election, which I suppose counts as "election fraud" but has basically no connection to the Big Lie.
"That is to say, there was evidence that could be convincing at a glance (especially if you were motivated to believe it), but when looking more closely turned out not to be anything meaningful."
This is one of the fact-checkers' favorite Newspeak tropes. Translation: "Yes, your most basic, fundamental senses may lead you to believe something, but let me put some context on it for you. Let me provide *nuance*."
"You may see the fires and hear the explosions with your own eyes and ears, but let me tell you why that's a peaceful protest."
It's a form of gaslighting. They want you to doubt even your most basic senses and rationality and just defer to them for the Truth.
But is it your opinion that first impressions are always correct, that nothing is nuanced, and that context never matters? If not, there must be some cases where the "Newspeak trope" is correct, no?
Really, that's what you're going with? "My first impressions are always right, anyone who disagrees is just gaslighting me"?
Remember that "too good to check" story, when a doctor said hospitals were overwhelmed with people taking ivermectin and making themselves sick? Was Scott gaslighting us when he gave us several pages of investigation into how that claim came about, why a journalist might have thought it was plausible but it turned out to be false?
This is my explanation for why Zeynep Tufekci, a sociologist who studied the role of social media in real-world phenomena, turned into one of the most prescient COVID pundits from the very start: she has no microbiology background, but she has a finely-tuned sense of who is playing politics with the truth, and which ideas are being brushed aside for reasons besides validity.
Huh, interesting point!
Indeed, reading that section of your article (noticing “good rather than glorious”) had me wondering if it was a direct allusion to Dr. Tufekci, and her own professed skill in that department (as described in the [edit: article below])
Similarly, my mind went directly to [https://www.theinsight.org/p/critical-thinking-isnt-just-a-process] where Tufekci describes the "authoritarian muscle memory" involved in "reading between the lines of official statements."
Ah thank you for posting. *that* is exactly the article I was thinking of, not the one I actually posted. I vaguely recalled Tufekci writing about this topic well but misremembered in which article.
The children of narcissists and addicts also get very early training in bullshit detecting (I speak from experience, also as a therapist). It's kind of like learning a language naturally -- it's not the same if you have to learn it later running it through more conscious channels. (I don't know anything about Tufekci's childhood, so don't intend any commentary there). Maybe there's a genetic piece too, like supertasters.
Growing up in a developing country like Turkey, as she did, could also expose one to significant amounts of the Pravda effect.
Oh good point. Those of us who grew up in the US are a naive lot comparatively speaking. Not for very much longer maybe.
News media outlets have a lot of discretion over what is news and what is not news. Obviously, wars, stock market crashes, blizzards, etc. are going to make the newspaper. But in a country of 330,000,000 people there is always more potential news to report upon than there is space for it, so judgments must be made.
For example, the New York Times, which traditionally strongly influences the rest of the news media, finds the rather dusty story of Emmett Till, a black youth who was murdered in 1955 by whites, to be worthy of constant coverage. The name "Emmett Till" was mentioned in the NYT in 57 different articles in the last 52 weeks, and in 407 articles since 2013.
The once-a-week invocation of Emmett Till serves the Times' purpose of encouraging readers to believe the Narrative that blacks are in grave danger of being murdered by whites. Granted, somebody with good critical thinking skills might notice that if you have to keep bringing up a 67-year-old incident to serve as an example of your statistical hypothesis, you might not actually have a strong case. But most New York Times readers are more in tune with the mood music than with the data.
In contrast, the New York Times does not much at all like to report on black-on-white violence, treating it as distasteful police blotter items of only local interest. Not surprisingly, readers of the national news thus tend to get a highly lopsided and biased view of the criminal justice system, with disastrous consequences, such as the historic increases in murders and traffic deaths since the declaration of the racial reckoning two years ago.
I'm going to not ban you for this because honestly I started the talking about the way the media reports race and crime, but maybe limit yourself to doing this kind of thing once per comment section?
Posting Sailbait and then threatening to ban Steve for commenting more than once, that's cold. Is a man not entitled to an outlet besides his own 23,529 blogposts?
If after instituting this rule you mention black women's hair issues in a blogpost you'll probably kill the man.
If you want to give Steve a taste of his own medicine why don't you post a "much more than you wanted to know"-comment on, I don't know, an epidemiological CDC data-mystery? He won't know what hit him!
https://www.unz.com/isteve/why-are-there-22-more-black-homicide-victims-in-summer-than-winter/
lol
While this is certainly your blog, and you can ban whoever you want, his point here seems incredibly relevant to your post. I had no idea the Times mentioned Emmett Till so often, and it does create a perception in the minds of readers. Being constantly reminded of an egregious injustice cannot be anything but designed to create that impression. Similarly, someone following Trump's Twitter could expect to see a lot about how bad Trump's political enemies are. If we were following Trump, we should acknowledge and adjust for that bias. Noticing the bias seems like an integral part of the process, and a rationalist should absolutely go out of their way to help recognize these biases in the news we all consume.
Someone needs to create a twitter-bot that automatically Sailer-posts the following when one of those BLM-buzznames are used:
"When white policeman Kyle Rittenhouse shot George Floyd and Emmett Till on January 6th, 1619, it was not just the white supremacist murder of two more black men, it was a lynching of *all* black bodies, which built our democracy, but were redlined out of generational home equity."
Shit like that can stay on twitter, thanks.
I have no idea about the context or racial issues at play here, but I think the broader point is *incredibly* germane - experts and the media do not have to make grave and deliberate errors of commission to push past the boundaries of your trust. They can make choices about what to say, how often to say it, and what not to say and how much they should not say it.
I read neither The Federalist nor MSNBC, because they both make outrageous choices about what to cover. I don't know that I've ever seen The Federalist lie, but they may as well by how slanted their coverage is. MSNBC I have seen straight out lie (or say things they really should know to be lies) as well as being deliberately one-sided to a ridiculous extent.
Knowing that the NYT reports on nearly 70-year-old news *regularly* in an attempt to rile up their readers is certainly similar, even if less egregious.
I think the NYT is quite biased, but it's not a random news story; it set the stage for lots of things. It's a piece of history, and they should reference it the same way they reference the collapse of the Berlin Wall. (That should be mentioned in a lot of stories about life in modern Eastern Europe.)
They're not reporting it like it's fresh news, are they?
I haven't read all of the stories mentioning Emmett Till, but the small sample I have seen are more along the lines of "this is like now, you should be upset" than the "this is what things were like before" you might expect from a history lesson.
If a major newspaper mentioned the Berlin Wall more than once a week for a year straight, you wouldn't think that was weird and maybe putting too much emphasis on it? Mentioning it at the anniversary of its fall or something, sure, but every week?
I mean... I know you are concerned about what take-aways people will get from your comment section, but I think Steve's comment is pretty on point here.
Which is probably why Scott did not comment on the other similar comment, and merely requested that Steve limit himself rather than go away entirely.
I hope no one thinks I was telling Scott what to do. Obviously he is an expert at making an online community, whereas my blog hasn't even found its audience yet
I do not think you were being rude FWIW, just that I understand why Scott would want to limit some of his commentariat's obsessions.....
I'm sorry, I realize that a lot of people do their best thinking in the abstract or using trivial toy examples because they are more comfortable thinking in that manner, but I do my best thinking about concrete topics of major public importance. I apologize if that discomforts readers.
I have several fairly novel insights into media bias, but they largely come from my decades observing the most important media outlet, the New York Times, spin the most controversial topics of our time such as race and crime. Unfortunately, my brain is better at coming up with and remembering ideas about topics that are important, disputed, and sensitive, so most of my better discoveries about how the dominant modes of modern discourse fail are tied to subjects about which many people get upset learning that the media's conventional wisdom is based on fallacies.
What I appreciate perhaps most about this community is seeing things pointed out to me, that were obvious the whole time, but that I'd never noticed.
My mental model of “lying” is the distance between what someone is saying and what they consciously attribute truth to. The longer that distance, the more lieness they have. If they just refuse to come to a conclusion, to cheat and jam the lieness calculator, I give them an automatic 30% lieness score with an “undecided” annotation.
So: “Really savvy people go through life rarely ever hearing the government or establishment lie to them. Yes, sometimes false words come out of their mouths. But as Dan Quayle put it:
Our party has been accused of fooling the public by calling tax increases 'revenue enhancement'. Not so. No one was fooled.”
I like this, but there’s a missing piece. Substitute “fooled” with “betrayed.” Quayle says no one was betrayed because everyone understood the code. But collectively there was a betrayal of the information transfer process, by use of obfuscation. Obfuscation always takes a tiny bit of extra effort, because it has to tilt in the preferred direction. So there was effort made to present something not congruent with Quayle’s personal attribution. That’s a higher lieness score. Zero-consequence obfuscation is not a thing; if it wasn’t accomplishing something they wouldn’t do it. Maybe it was finessing attention away from the topic, making it slide by unnoticed, so whoever bothered to think about it would crack the code and not be betrayed, but more people would simply not notice?
So “really savvy people” are not experiencing constant betrayal, because they both pay attention and know the code. They may be able to change with the conditions and not sustain harm. But the lieness score for Quayle is still nonzero.
Unwillingness to score someone on lieness is not necessarily gullibility, but it is unwillingness; if I’m not betrayed either way, surely I can look at the nonzero lieness score?
“Clueless” may struggle to distinguish the code from the lieness. It may cobble together into a “likelihood of betrayal” score.
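The lieness heuristic described above could be sketched in code, purely as an illustration. The 30% default for "undecided" and the idea of a distance between statement and belief come from the comment; the 0.0-1.0 scale, the function name, and everything else are assumptions for the sake of the sketch, not a real model:

```python
def lieness(stated_vs_believed_distance=None):
    """Toy lieness score on a 0.0-1.0 scale.

    `stated_vs_believed_distance` is a judge-supplied estimate of how far
    what someone says sits from what they consciously believe to be true.
    If the speaker refuses to come to a conclusion at all (jamming the
    calculator), apply the automatic 30% score with an 'undecided' note.
    """
    if stated_vs_believed_distance is None:
        return 0.30, "undecided"
    # Clamp to [0, 1]: the further the statement is from honest belief,
    # the higher the score.
    d = max(0.0, min(1.0, stated_vs_believed_distance))
    return d, "scored"

# Quayle's "revenue enhancement": everyone cracked the code, but the
# obfuscation still put some nonzero distance between words and belief.
print(lieness(0.2))   # small but nonzero
print(lieness(None))  # refused to conclude: automatic 30%, "undecided"
```

On this toy scale, "really savvy people" aren't betrayed, but the score they would assign Quayle is still above zero, which is the commenter's point.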
I don't think it's fair to indict Quayle for this statement. Notice he's saying "our party." This line sounds to me like a reprimand of his party for lying, wrapped up in the aphorism "It was worse than a crime, it was a mistake." He is saying, "Not only did we lie, we lied ineptly!"
I may be wrong, but more context to the line is needed.
In fact 70% of people thought Saddam directly responsible for 9/11, not just that he had WMD. That he had WMD was not a fabrication of the media but the political establishment- that or they actually believed it. The media was reporting what the administration said. I don’t remember there being much opposition to the idea that Iraq probably had chemical weapons, outside left wing anti war journalism.
Calls for invading Iraq started pretty soon after 9/11, and the media did conflate Iraq and the event pretty soon after. Bush apparently blamed Iraq within 3 days; the media implied or said outright that Iraq was responsible directly, or that Iraq helped Al Qaeda. 82% believed the latter. This couldn’t have just been the right wing media; its reach is not far enough.
https://www.brookings.edu/blog/order-from-chaos/2021/09/17/9-11-and-iraq-the-making-of-a-tragedy/
I was paying *extremely close attention all the way through*, and I am sure that, although the Bush administration wished very much for the American public to absorb the insinuation that Saddam was involved in 9/11, they successfully accomplished this without ever directly claiming that he was.
Yes. Very few of the media said it directly either. However the “threat of WMD” was often shown after a montage of 9/11, or there were pictures of the twin towers burning in the background during a discussion on Iraq, or people would mention 9/11 when talking about WMD getting into the hands of terrorists.
At the time, nuclear or large-scale biological terrorism seemed like a genuine threat. Since 9/11 was a massive escalation compared to anything that had come before, it didn't seem crazy that there might be an even bigger escalation. Al Qaeda had proven themselves more capable than we expected, and the very nature of terrorism had changed from a focus on threats-and-demands to outright "fuck it let's just kill infidels". Al Qaeda of 2002 definitely _wanted_ to start nuking Western cities, and with enough nukes floating around it didn't seem totally implausible that it might happen.
Now it's 2022 and we see that 9/11 was the high water mark for terrorist capabilities rather than being a harbinger of a new era. It looks like the War on Terrorism actually did work, in the end.
There was, at the time, the reported meeting between Mohammed Atta and an Iraqi official in Prague. This was the strongest evidence of Iraqi collusion in 9/11.
https://en.wikipedia.org/wiki/Mohamed_Atta%27s_alleged_Prague_connection
Only recently did I find out that this alleged meeting was probably a mistake by Czech intelligence; the Iraqi official probably met with a different guy called Mohammed Atta.
That story disappeared very quickly after it was found to be untrue, but it may have lingered in people’s imagination. The debunking had less fanfare than the original story.
Yes. That was the CLOSEST they ever came to a direct accusation, and I remember it well, but even if it had checked out it wasn’t enough evidence to say Saddam was involved in 9/11, and it never got corroborated anyway.
The other thing that they used to say a lot is that Saddam was directly involved in financing terrorist organisations.
And he was, but as far as we know he was only directly financing Palestinian terrorism against Israel, not Al Qaeda terrorism against the US.
Yes. But there was a very strong lobby influencing politicians to regard the two mentioned types of terrorists as equivalent.
I was surprised by this example in Scott’s article as well. I haven’t seen evidence that the claim at issue (that Iraq had been stockpiling weapons of mass destruction) was a lie by anyone, let alone by media outlets simply reporting the claims of intelligence communities. It turned out not to be right, but I haven’t seen reason to think that the intelligence communities fabricated it. And I would be stunned if Fox News had known it was false while reporting the claims.
My recollection from the time was:
Iraq had definitely had chemical weapons in the 1980s, because it used them on the Kurds.
After the 1991 war, Iraq had been banned from having them and UN inspectors had gone in to confirm this.
Iraq had been playing games with the inspectors so it was possible that they were hiding chemical weapons, but also the inspectors had never actually found any.
US intelligence had evidence that there were chemical weapons that were being hidden, but it wasn't all that certain and analysts within the intelligence community were split.
People within US intelligence, knowing that the President wanted evidence that there were chemical weapons, chose to present only the analyses that showed that there were weapons and not the analyses that said that there weren't.
I am unsure how much actual lying was going on and how much people were fooling themselves.
It's complicated by most people thinking that WMD = nuclear weapons, when it includes nuclear, chemical and biological.
Yeah, there's a huge difference between having a thermonuclear warhead on an ICBM and having a barrel of leftover mustard gas hidden away with no effective delivery mechanism.
This is my memory of events as well. There was legitimate reason to believe that Iraq might, and likely did, actually have WMDs at the time of the invasion. That we never found them can still register as an "oops" rather than an intentional lie, though I am sure that there were officials and people in the media who were aware of the potential we were wrong and withheld that information. I would call that a lie, to share one side of the story and not the other, to intentionally produce an outcome.
My understanding is that Iraq had active chemical weapons programs, along with R&D programs targeting nuclear and biological weapons, up until 1998 when the Clinton administration ordered a bombing campaign against where they thought the Iraqi government was hiding their WMD programs. That bombing campaign was much more effective than we thought at the time, and Saddam decided to mostly abandon further WMD programs at least until he could get sanctions lifted, apart perhaps from some small scale stuff to lay the groundwork for post-sanctions resumption.
Following this, the Iraqi government attempted to send contradictory signals to different audiences. To the US and Europe, they tried to project the accurate impression that they'd abandoned WMDs, in hopes of getting sanctions lifted. But they also tried to convince Iran and domestic audiences that they still secretly had WMDs, to deter Iranian aggression and would-be rebels and to reassure the Iraqi military that the regime had the means to defend itself. The US intelligence community then picked up on the disinformation campaign and largely believed it.
I think you are being generous to the intelligence community. It was their job to know whether Iraq had WMD or not; an invasion was and is a major event - probably the defining event of that decade. Also have a look at the Office of Special Plans.
https://en.m.wikipedia.org/wiki/Office_of_Special_Plans
I don't think that they fell for it entirely for good reasons: motivated reasoning and institutional groupthink certainly seem to have played a massive role.
I forgot about the disinformation scheme that Saddam himself was using. He was trying to play both sides of that, and ended up misreading how serious the US could be about invading. I think in retrospect he could have come clean about not having any WMDs and saved himself from the invasion. But, if he had capitulated to US demands that easily, he would lose face in the region and possibly have rebellions or other invasions to worry about instead.
I read more recently that Saddam had chemical weapons whose mechanical parts had broken down. Then ISIS used those chemicals in homemade chemical weapons which they used against U.S. troops. The Army then covered up the resulting illnesses because Saddam had received the original weapons from Reagan (for use against Iran, IIRC).
Unfortunately I don't remember where I read this.
Wherever you read it, it was largely wrong. Iraq's chemical weapons were home made (these things aren't that hard to make) with the imported equipment mostly coming from Germany.
https://en.wikipedia.org/wiki/Iraqi_chemical_weapons_program
The Germans involved (private companies, not the government) didn't explicitly provide it for the purposes of making chemical weapons, but may have had a wink wink understanding about what they were likely to be used for. Three Germans were later convicted of export offences.
My recollection:
The US was angry like I'd never seen, even in the days of the Cold War, and that anger was focused on any faction that advocated international terrorism basing its enmity in Islamic tenets, wherever it was.
The Bush administration policy was consistent with addressing this anger. Seeking justice for 9/11 wouldn't be enough; the previous decades were a series of terrorist acts for which justice was sought, maybe obtained, only to be followed by more terrorism. To treat 9/11 as yet another police action was to continue the vicious cycle; US policy was compelled to address the root cause.
That root cause was widely seen as Iran. More precisely, the Supreme Leader and his supporters. There's a political cartoon out there depicting Hezbollah as a puppet operated by Syria, itself a puppet operated by the Ayatollah. All roads led to Tehran.
Trouble is, the Ayatollah was very well protected, self-sufficient, and in full control of the press that fed his support. If the goal was to attack the root, the US would need to turn other Islamic entities against him. Saudi Arabia was already an ally; this is why it wasn't attacked, despite 9/11 being the brainchild of a Saudi. (Besides, bin Laden wasn't hiding in Saudi Arabia by then.) Libya was too far away, and the US needed bases in the area.
For a complex mix of reasons, the best first candidate was Saddam Hussein. Iraq itself already had reason to oppose Iran. Meanwhile, Hussein was a vocal supporter of Palestinian terrorism, even known to compensate the families of suicide bombers. Hussein was an easy villain.
There was actually a three-point case to make for an Iraq invasion: Hussein's repeated violations of UN resolutions; his human rights abuses record; his possession of WMD. For reasons I still do not understand - perhaps simplicity; perhaps belief that this had the most compelling evidence - the US focused primarily on the third.
Don’t get me wrong. The newspapers were still to blame. Their job isn’t to back the government. There were people in the administration who were opposed to war (I posted a link to one), and people in the CIA who doubted the intelligence.
Everyone forgets there was a list of 121 reasons to invade Iraq. Not every one of them was an overt "act of war," but most of them were pretty bad.
One of the reasons I remember was that Saddam sent assassins to kill former US President Jimmy Carter. Saddam was trying to build "the big gun" (there's a TV documentary on this), basically a very large-gauge cannon that was aimed at Israel. Saddam trying to buy uranium from Chad: we had a major internal diplomatic row over the investigation, where the ambassador's wife was outed as being a CIA agent—oh, imagine that, an ambassador's wife is a government agent. The ambassador and his wife lied in the report about the results of the investigation—Saddam really was trying to buy uranium from Chad—contrary to her findings. Saddam's sons used the primary school system to collect young sex victims ... and daddy had a plastic shredder repurposed as a people shredder to take care of any complaints. Saddam used chemical weapons against his own people. The list, as I said, was 121 line items.
I’m not forgetting that. But the 9/11 confusion was intentional because the other reasons were either bogus, or insufficient legally for actually invading a foreign country!!
“The President’s son rapes women and doesn’t get prosecuted for it” is a terrible thing but international law doesn’t recognize that that justifies a foreign country bombing and invading and occupying and installing their own regime!
Well, there's no such thing as "International Law", and the big problem in the leadup to the Iraq War was trying to persuade the sort of people who believed that there was.
I was a big Iraq War proponent at the time (I was quite young). The argument that convinced me to support it was basically "Anyone who wants to overthrow a dictatorship and install a democracy is alright with me; this is a hostage rescue situation rather than an invasion"; the actual WMD issue wasn't that important for me personally.
I don’t believe in “International Law” as something that exists in the absence of specific charters and treaties, but charters and treaties are things that exist, and the UN was pretty clear that they did not consider their charter to justify the actions of the US in Iraq. More generally, the US has a very regrettable tendency to justify bombing and invading other countries based on rhetoric about “bad guys” and “evil”, while using the words “democracy” and “dictatorship” very selectively in a region of the world which includes such countries as Saudi Arabia and Iran.
The fact that bogus justifications were fabricated tells you practically everything you need to know about the legitimacy of the war. I used to believe that the intelligence agencies simply made a mistake about Saddam’s intentions and capabilities, but with hindsight it has gotten clearer over time that this was “decide on war first, assemble reasons later”.
The yellowcake story was a total fabrication, although it was one of the main reasons for war. See:
https://spyscape.com/article/saddam-husseins-fake-uranium
No uranium was found in Iraq except the decommissioned weapons from the early 1990s. Valerie Plame was outed by a Washington Post journalist because her husband had, on his trip to Niger, found no evidence of those sales and had written an op-ed in the NYT to that effect.
The uranium story wasn't fake, and you have made omissions. Saddam hadn't bought uranium from Chad. But Saddam did send agents to Africa to try to buy uranium. This is what the kerfuffle was about. Valerie Plame wasn't in Iraq looking for uranium; she was in Africa looking for evidence of Saddam's uranium buyers.
It's like saying Joe hired someone to murder you ... well, the murder never happened, so why are you upset ... just what business do you have, saying Joe should be tried for murder, when it's obvious the murder never happened?
Mobile truck mounted uranium processing plants were found, destroyed, and people were poisoned by salvaging contaminated containers from the wreckage for household use.
Saddam didn’t buy any uranium, and that was the case against him. Not buying uranium is not having WMD. The US didn’t present as its case for war in the UN that Iraq tried to buy uranium but failed - and even that story is dubious.
And your analogy falls down. It’s as if there was a law saying you could kill a guy if he bought a gun to kill you - which would be absurd in itself - but instead you kill him because he tried but didn’t buy a gun.
I took Michael's point not as that Saddam was guilty for having a gun he failed to get, but rather that he was planning to hurt someone in the first place. Last I checked, conspiracy to commit murder *is* considered illegal.
The public's reasoning was more or less:
1. We want vengeance on the perpetrators of 9/11.
2. Our leaders want to overthrow Saddam.
3. Therefore, Saddam must have been a perpetrator.
That may be how they reasoned then but it’s not how they would think about it now, far too much trust has been lost.
But we had already invaded Afghanistan.
Why would the government saying that the harvest will be good instead of glorious mean the harvest will be bad?
It's like when you are asked to comment on a colleague's skills and you say "hmm yeah, he's OK I guess", you are conveying that actually the colleague is an incompetent oaf and you'd be happy to be rid of them.
Specifically in Soviet circles you could be punished for saying things against the state or against communism, so people learned to say things very positively, but less positively than they might. It was a code, as Scott says, because you weren't allowed to speak honestly using the correct words.
It’s like when Donald Trump says they are “fine people” on both sides as opposed to saying “these are glorious freedom supporters” for the people on the right wing side of a conflict. He needs to say something positive but he can’t bring himself to say anything more than “fine people”.
In my experience it’s clueless people who end up being the gullible ones when, in the throes of their paranoia (fear), they fall for a conspiracy theory or magical religious thinking. They want to believe it. Those types are perfect marks for grifters/scammers.
For the Lincoln example, you can argue that the journalists *know* most people don't read past the headline. So the speculative piece was an excuse to push, in the headline, the myth that Abraham Lincoln was into Marx. But I accept the wider point.
For the science case: if you take something like the causes of autism, the public have a great interest in it but are led to believe it's just some random great mystery. The actual science is now in a position to make some definitive statements about likelihood. But none of this is propagated to the public, lest it make unsavoury geneticists look correct.
They know that most people don't read past the headline, but they think that most people read headlines the way they do - that you read the headline to decide whether or not to read the article, and you know that the headline itself has no informational content, so if you choose not to read the article, you have learned nothing from the headline.
They know that headlines are clickbait, i.e. the purpose of the headline is to convince people to read the article—an article that may well then correct the view that the headline instilled.
They are *wrong* about this, but that's what journalists believe.
This is a great post. It would be even better if it explicitly acknowledged that the rules change, and that we are living in a time in which that change is rapid.
Rapid change should, and often does, undermine people's confidence in their ability to discern what is true from what is reported.
I'd highlight one particular change as having been explicitly planned and having backfired spectacularly:
Before Trump, most quality media organizations were committed to reporting on events neutrally. They always presented both sides of the argument, and avoided drawing conclusions.
The argument was then made that if one side is lying through their teeth and the other is telling the truth, this approach may serve to mislead more than inform. This sounded eminently reasonable to me in theory, and it came to pass.
Unfortunately, it has not worked out very well in practice. Being freed from presenting the other side's arguments has led to a great deal of disinformation and severely compromised my default trust level in articles appearing in the New York Times and Washington Post.
I suppose this is better than most such changes, in the sense that it was at least explicitly discussed and thought about.
Or maybe not. Maybe this illustrates how little value explicit discussion actually has, since our collective wisdom is insufficient to avoid serious harms.
I agree. Even the idea of getting "news" from "Fox" has changed. I'm guessing that most people aren't going to "Fox" for the "news", but getting it in their feed. So, as one scrolls through their feed, are they able to do the kind of code-switching required to filter "fox" "news" properly? Also, due to the rapid change you refer to, new generations may be ingesting "fox" "news" in very different ways, which would also suggest that Scott's way of thinking about the topic may be antiquated.
I think we might have our heads in an echo chamber. As Forbes says, Fox is the second most trusted news source: https://www.forbes.com/sites/markjoyella/2021/08/09/cnn-msnbc-drop-in-trust-ratings-as-fox-news-channel-rises/?sh=213c441527c0
I thought that the quoting indicated that 'Fox' was being used as a stand in for [insert heavily biased media source of your choice]. I think most folks would agree that Fox is widely respected as a news source (though perhaps it shouldn't be).
Good observations!
That said, I think it's fair to say "this is a game I'm not interested in playing". That's my stance. I feel confident being able to tell the difference between the cases more often than not, but since most of the news is not interesting to me anyway, I just don't expose myself to it. I don't need to constantly worry about getting it wrong in the edge cases and waste brain cycles on that.
Given the rare scenario someone wants my opinion on something from the news, I can offer my abridged first impression thoughts based on their summary with disclaimers, or I can dig in then. This has been working well for me, but whether it does is necessarily dependent on one's social circle. (There are some where even the disclaimer "I don't actually know anything about this yet, but from what you've told me," might prompt outrage.)
But I think a lot of people who distrust the news distrust it for the stories it *doesn't* tell. For example, my instinct on reading this article's first summary on the Lincoln/Marx topic was "and how many friends did Lincoln have? Is there evidence he favoured Marx's views any more than some others?".
Similarly, when the news tells me, for example, about some bad thing [big corporation] did, but doesn't let them speak up, I wonder if the corporation has an actual reasonable justification for their actions that's being swept under the rug (sometimes they do, sometimes they don't). Same with political parties, nation states, et cetera.
And I suppose sometimes they also do just screw up and "lie", but I honestly get the impression that's just because humans are involved and humans sometimes make mistakes - it's typically not an attempt at fabricating facts. (Granted, that might be an observation true for the Tagesthemen in Germany, on which I'm basing most of my opinions, who seem to at least *want* to take journalism seriously. Fact-checking can be hard, even for big players, but it's a very, very rare event that they screw it up completely.)
See also Scott Lawrence's comment, which gets into that failure mode.
Presumably at some point you're going to be faced with decisions like "should I evacuate in the face of the hurricane that's about to strike?", or "should I lock myself at home for the next two weeks while a deadly plague sweeps through town?" or "is World War III about to start?" For that, you'll need to know whether there's a hurricane or a plague or a major international crisis in the works.
And while there are reliable specialist sources for each of those, they are specialized and it would be intractable to follow them all. So unless you're planning to ignore the world at large until you e.g. suddenly notice the roof blowing off your house, you'll need some ability to look at a general news source like CNN and differentiate between "this is important, actionable information" and "this is hype". That's the game, and if you don't play it, it plays you.
"For that, you'll need to know whether there's a hurricane or a plague or a major international crisis in the works." These sorts of problems have yet to arise without me finding out through completely different sources. I am perfectly willing to continue assuming this will be true. So I'm afraid you've done nothing to convince me that I need to play this game - but this definitely may be different for other people, and is a good caveat to keep in mind for the general case, yes. :)
Grammar nitpick: “then one extra mass shooting” -> “than”
Didn't Scott already write this essay? I think it might have been part of a much longer essay on another subject, and in the other version it suggested that middle class people, being one step closer to, and hence having a better mental model of, the sort of people who actually have power, are better at sorting the lies from the not-quite-lies.
To pick on the examples, though, I think you have far more faith than I do in the Washington Post's reporting on election fraud. I'm not saying that there necessarily _was_ massive fraud, but I can't see any mechanism by which the WaPo would be inclined to look into whether there was; as an organisation, the Washington Post had a fundamental incuriosity about any story that might help Trump (what _was_ the deal with those Hunter Biden emails anyway?), so they have no more interest in finding out whether there was electoral fraud than the Swedish government has in finding out whether immigrants commit more rapes.
The linked article is a perfect example of why I can't trust the WaPo's reporting on this subject, it's incredibly disingenuous. As slam-dunk proof of the paucity of fraud, it offers the fact that only a small number of double votes were found... but double voting is the dumbest and most blatant form of electoral fraud there is; I'd like to know how many mail-in ballots were stolen, either before or after delivery, and how many ballots were "harvested" in suspicious circumstances.
This article is the equivalent of "Kangaroos don't exist, I checked my back yard and my front yard and didn't find any".
"I can't see any mechanism by which the WaPo would be inclined to look into whether there was; as an organisation, the Washington Post had a fundamental incuriosity about any story that might help Trump"
If they shout from the rooftops that there was 100% definitely no election fraud and anyone who thinks so is crazy, then some Watergate tapes drop and it's proven that there actually was, they're going to look very stupid. Most news organizations care deeply about their reputation, and I propose this as a mechanism which limits the rate at which they make unverified factual claims. Someone in the editor's room is getting paid to think "Wait, but what about ballot destruction? We'd better make sure that doesn't blow up on us."
News organizations shouted from the rooftops that Trump-Russian collusion was a well-established fact, and that the 'real damning evidence' was just around the corner. It was later revealed to be a fabrication by opposition political operatives and they paid no reputational price for it. Indeed, some continue to make the assertion, despite the evidence not turning in their favor.
This and a dozen other examples where collusion to peddle a 'narrative' ended in discredited stories has not hurt specific news outlets. Part of the protection in this game is that they all echo each other. There's safety in a crowd. So long as everyone toes the same official line, nobody gets punished for getting anything wrong.
And this goes back well before the days of TDS. Remember Bush Jr. and the absolute confidence that Hussein had WMDs? Nobody paid a reputational price for blindly repeating that line either.
Did newspapers say Trump-Russia collusion was established fact, or was that pundits while the papers stuck to leading headlines and carefully hedged "According to the Grobnatz report..." statements?
Actual question, I'd like to see how much you've got on them committing to falsified claims on the issue. I don't know of any but journalism is big so I expect it happened *somewhere*.
I don't save my sources as closely as many here do, but I did read Attkisson's recent book on the subject: https://www.amazon.com/Slanted-Media-Taught-Censorship-Journalism/dp/B0854MY6SM/ref=sr_1_2?keywords=attkisson&qid=1643217070&sr=8-2
I remember a lot of the stories she cites in the book, and was surprised at how many of those stories were outright lies - not just careful hedging. She identifies a number of statements that were later proved false, but were either not retracted or whose retraction was a minor footnote buried in section Q while the main story got front page. (More often there was no retraction.) She also demonstrates how the authors knew or should have known based on evidence available at the time that what they were reporting was not true on its face, or didn't stand up to even minor scrutiny. That falsehoods were credulously reported without any attempt at verification or falsification.
That's not to say the news media has stopped the sleight-of-hand word choices, just that there's no longer the line in the sand Scott claims they're unwilling to cross. The claim that "news is separate from opinion" is no longer true. A lot of opinion is now reported as news in the news section by news writers stating opinion as 'fact' without citing a source.
Can you find a specific example regarding Russiagate in particular?
Again, I don't collect references that assiduously. I recall more than one example in the book, though, specifically regarding Russiagate. It wasn't just a phenomenon of accidental erroneous reporting. Certain people knew they were spinning fabrications into legitimate news channels and did it anyway.
Try this out: https://taibbi.substack.com/p/master-list-of-official-russia-claims
This isn't even a complete list, as Taibbi himself has included others in different articles. I think it got so long he was tired of updating it continually.
> If they shout from the rooftoops that there was 100% definitely no election fraud and anyone who thinks so is crazy, then some Watergate tapes drop and it's proven that there actually was, they're going to look very stupid
This doesn't bother them much. As sclmlw pointed out, they've never faced any consequences for being badly wrong in the past, so why would they in the future?
Besides, Watergate only got revealed due to the WaPo putting investigative resources into the story. If nobody ever investigates electoral fraud in 2020 then it will never get reported on.
About election fraud, and the possibility of a newspaper finding it: I saw comments online from a man who did election-monitoring in Iraq during their first elections after the fall of Saddam Hussein's government.
That man had lots of training from portions of the U.S. Government in how to spot indicators of fraud. Some of those indicators included things that actually happened, at a local level, in the 2020 elections in the United States.
Among those indicators are: heavy uses of Absentee Ballots outside of the limits prescribed in law, irregular practices in handling Mobile Ballot Boxes, election observers ejected for a portion of the count (or told counting stopped, only to discover counting continued while the observers were gone).
This isn't slam-dunk evidence, but it is suggestive that the elections were not 100% free of fraud.
Or of course you could just learn something about the underlying measurable facts of the situation, and be able to judge when the experts (or politicians) are shading and when they're being straightforward -- on a sound *empirical* basis, and not via either the amateur social psychology you hopefully picked up in your mother's milk and/or School o' Hard Knocks, or via a Jesuitical parsing of the exact linguistics.
I mean, this is what we do elsewhere. If I want to know which financial pundits are lying through their teeth[1], the best advice is to dig in and learn something about finance, stocks, options, et cetera, master the vocabulary and math, and start paying attention to ticker symbols. Basing some critical judgment on the social psychology of journalists, or an elegant reading between the lines of their prose, is a very poor second best.
--------------------
[1] Spoiler alert: all of them.
How are you going to learn anything relevant about the underlying facts of e.g. which towns (if any) saw mass shootings yesterday?
Well, let's see, I would probably start with learning some basics about guns -- what kinds there are, what kinds are legal, that sort of thing -- so when I read a report I would have some background info that let me critically evaluate reports of the use of "an assault rifle." I would also have taken some modest time, from adolescence roughly, to pay attention to the very many crime reports that come from many different sources, so that I had years to decades of background info on the approximate normal rates of murder, and how they depend on location, what kinds of motives, correlations with gangs and drugs. If it were an issue in my city, I might go to a few city council meetings where the issue would undoubtedly be discussed.
Then if I was not immediately familiar with the unfortunate town in the news, I might dig into what kind of town it was -- lots of places to get that info -- and think about whether what I already knew about the correlation between violence and nature of the burg made sense for this particular city, e.g. if it happened in Philadelphia I wouldn't be at all surprised, but if it was said to happen in Del Webb's Sun City in Retirementville UT there might be some heightened scrutiny I'd bring to bear.
Et cetera. This is just off the top of my head, mind you. If this were an important issue to me, and I really wanted to be able to form an independent basis for judgment, there's a ton of research resources I could access without moving my overweight ass from my desk chair. Click. FBI Uniform Crime Report, correlated by age, race, sex, location, clearance, et cetera. Another click. About a ton of think-tank studies on violent crime, and even especially on mass shootings. Another click: info on weapons used, by people from a dozen viewpoints. We live in an era of information cornucopia; if you have trouble finding independent data sources that touch on mass gun violence in 2022, I suggest you're not really trying very hard.
But how are you even going to get to the "unfortunate town in the news" part, if you're assuming that the media can't be trusted to tell you the names of the towns in which there were recent mass shootings and it's all on your own personal understanding of the issue?
I don't think in such black and white terms. That's not a sound basis for empiricism; that's a scholastic kind of viewpoint, which I reject. There is considerable daylight between "everything the media says is false" and "I have no idea -- no corroborating evidence from my own experience -- that lets me evaluate how likely it is this and such story is true, so I guess I have to wander off into hairsplitting the exact terms and speculating on the psychology of the authors, like some medieval monk trying to infer the nature of the chemical elements by studying every syllable in Plato."
You're not a kid fresh out of school, I'm sure you know how to do this, so what are you trying to say? We all weigh the credibility of testimony, all the time, in our ordinary lives. Not everything my colleagues at work say can be 100% trusted either, nor people in my community, or strangers, salesmen, contractors, nor even my own friends and family, and so in each and every case I need to weigh up knowledge I have from my own direct experience to adjust my credence. I can think of no ordinary part of my life where my only choices are blind faith or a scholastic navel-gazing analysis of only the communication itself -- where I have *no* empirical experience bearing on the subject to assist. Still less can I think of any part of my life where it's *important* to me to evaluate the testimony of strangers critically, and yet where there's no background knowledge I could gain myself that would ease that task greatly.
FWIW, Samnytt.se (the news outlet referred to in the "immigrants' crime in Sweden" part) is basically a Swedish version of Breitbart News. Radically right-wing, racist and with a VERY lax view on journalistic integrity and – which the entire blog post is about – the truth.
I asked about this in an Open Thread and some Swedes said that Swedish-language sources confirmed it was basically true.
Not saying the basic premise – researchers find link between immigration and crime and get heat for it as a result – is false, just that this particular news outlet is not trustworthy. (Source: am Swedish too).
But you're agreeing that the story is true. Is there any reason you think it's untrustworthy beyond it not being aligned with your ideology?
Not think. Not related to ideology. I know from experience. But again: it doesn't take away anything from Scott's main points. Just a little unfortunate to use this particular source for the argument.
I think that makes it an even _better_ example! You're claiming that you "know" they're "untrustworthy", and I accept that you're sincere and probably not trying to deceive anyone, yet you also agree that the particular article referenced is (basically) correct.
If anything, it's an even better ('double') example for the argument being made.
No it doesn't. Every EU country has these types of "alternative news sites". They are known for very actively lying; basically their whole premise for existence is importing the U.S. culture war BS to Europe. Some are funded by the Kremlin directly, and their servers are hosted in Russia more often than normal news sites'. The official side calls all of it "hybrid warfare". Personally I don't take it so dramatically, but still basically ignore anything those sources write; they're usually very open about it because they know their target audience. That article too: I checked the "About us" before even starting to read, so I didn't read it. (To be fair, they're hosted on a U.S. server and perhaps the site layout isn't as attention-grabby/spammy.)
Okalmaru put it very well: this was a very unfortunate choice of source from Scott.
Reminds me of the early days of the internet, where "Google" or "Wikipedia" might have been quoted as a source. What you CAN do with these sites is take their links and follow them. Read the originals and make up your own opinion. They often raise interesting issues (prosecution due to the ethics of research) but then totally misrepresent them in the text (all of Swedish science production going through some kind of woke censorship). Sometimes the issues they raise are non-issues that are adequately and understandably explained by simply reading the original source.
What Scott did was quote something that is like Wikipedia: not a source, not a journalistic product, but just someone's opinion on the internet. Like a random comment in a random comments section.
At the same time, this might be a bit of a failure from the Swedish mainstream media's side. Trying to find mainstream newspaper coverage, I found e.g. https://www.aftonbladet.se/nyheter/a/6zlzLr/studie-det-utmarker-en-typisk-valdtaktsman https://www.dn.se/sverige/kriminolog-kritiken-mot-bra-rapporten-ar-valdigt-farlig/ which are behind paywalls. Maybe those paywalls also give these "alternative news sites" extra visibility. Just googling Khoshnood would only show them on the first page (that and a Norwegian alt-news site).
Yes, both these things are true: Samnytt is an alt-right publication, but their reporting was essentially correct anyway in this case. So it's a case of "not wrong, but a more credible source is preferable anyway" (if for no other reason than to avoid having to have this discussion first every single time).
If you want a link to a respected newspaper, you can use this:
https://www.gp.se/ledare/f%C3%B6rs%C3%B6ker-staten-stoppa-obekv%C3%A4m-forskning-1.58037037
Or "Swedish Reuters",
TT: https://tt.omni.se/flertal-domda-valdtaktsman-har-invandrarbakgrund/a/vAn6Xj
You are also absolutely correct that the argument against the study isn't erroneously saying it's false, but claiming that these kinds of studies shouldn't be done as they will have harmful effects, that the person performing the study must be bad or they wouldn't have made it in the first place, desperately searching for a legal argument to discredit the scientist (there is exactly zero percent chance that the legal technicalities would have been an issue if the study had reached the politically correct conclusions), and so on, and so on.
One of the most strained arguments, from the National Council for Crime Prevention, was "maybe not drinking alcohol creates more rapes?" Yes, _really_.
Ah, thanks for those links. My google-fu wasn't strong here.
But even so, looking at the TT piece you get a normally written news story that isn't attention-grabby like many of those "alt news sites". Nothing about the censorship case (maybe that's in another TT article?).
The Samnytt text was just colored with more outrage, precisely the toxic stuff that prevents me from looking at Twitter directly (I only open links to it from "trusted" sources, like IRC, or here). It's not just whether the facts are right; the tone is what makes it spammy or tolerable for me.
GP's columnist didn't do interviews like Samnytt seems to have done. So in that sense Samnytt's article was superior to this GP piece, since they got the voice of the researchers into the writing. It's sad when traditional media can't do their job more professionally.
How are they racist in a way that Scott or any other person who knows what they're talking about is not? How are they radical?
And the entire blog post is about how mainstream journalism has pretty shoddy integrity too.
I don't see a point in going into detail about this particular outlets' agenda here. It's not what Scott's post, which I generally agree with, is about.
I think Samnytt is heavily biased in *what* they report on.
I'm not aware that they publish any known falsehoods, though living in California I'm not the biggest authority on that.
Which is to say, it seems to me they follow Scott's heuristic pretty well.
Ten years ago, during the whole Muslim psychosis, I do remember this sort of whole-cloth lying a lot, though.
Does anybody else remember all those stories about "no-go areas" in Europe? All the while there were Europeans *living in those very areas* on the internet yelling that this was crazy?
I mean, "no go area" is pretty subjective. Most major American cities have areas that random middle class whites would be advised are "no go areas"; this doesn't mean that you'll necessarily get killed every time you venture in there.
Having said that, the fact that an area is "no worse than a US ghetto" is no consolation to someone who never had anything as bad as a US ghetto in their city until fifteen years ago and now finds themselves living next to one.
Yes, but I doubt comparable areas exist in most of Europe at all.
Not as deadly as in the USA, but there are certainly high-crime areas where firefighters and ambulance drivers need occasional police protection and public transport is sometimes attacked. In Brussels, 30 local youths held two police officers and prevented them from calling backup for some time, until one of the officers could talk them down in Arabic. This might not strictly be a "no-go zone", but it is certainly a "tread-very-lightly zone".
Same for Sweden. There are places where you need to send two police cars every time, since if you send only one, the car will get vandalized once the police are out of sight of it. Police escorts for ambulances, rocks thrown from overpasses and bridges onto both, and so on.
Fun anecdote: My brother is an army officer who had to organize patrols in Brussels and Antwerp after the attack on the Brussels Jewish museum. He got maps with certain neighborhoods marked as "no-go" in red. Though this probably has more to do with someone higher up not wanting to stigmatize or provoke the local population than with it literally being too dangerous for the army to enter.
Most claims I have seen regarding "no go areas" indicate that these areas are not patrolled by the police, with a very strong implication that even (or especially) the police are afraid to go there. That's not terribly subjective; you can just count the police cars and uniformed officers.
Well, *someone* can. It might be annoyingly tedious and expensive to fly over to Sweden or wherever and do it yourself. But if you e.g. have access to a blog where smart nerdy types from all over the world gather to talk about whatever interests them, you could probably ask if someone has local knowledge of the matter.
Oh yeah, that was fun to watch from London. It was, like, guys you get that no one has guns here, right? Gangland warfare doesn't mean that everyone is hiding behind their engine blocks from the hail of bullets, it means a bunch of teenagers got into a knife fight outside Asda.
Yes, we also don't much care. Having to hide from criminals who might stab us to death is only slightly less frightening than having to hide from criminals who might shoot us to death.
What would actually be useful information would be the extent to which your criminals preferentially target only other criminals (because attacking e.g. tourists would bring major heat down on them all) vs. preferentially targeting outsiders (because e.g. that cements their territorial claim and touristy outsiders in particular carry extra shiny). But that's not the information that is usually being offered by reliable sources, and it can be hard to track down.
I think the difference is in falsifiability. The no-go-areas claim is sufficiently vague that no one can conclusively prove you wrong, at least without a long argument about defining terms. But if you say "x people died" or "x person did it", those can be falsified, and in the latter case you can be sued.
I was living in one of those areas, and I was not yelling that it was crazy. To me it seemed like a serious problem that firetrucks and ambulances needed police escort when entering the area.
Excellently written. Your gift is appreciated. Reminds me of my youth when George Bush Sr. laid out his doctrine on a New World Order. It was like God had finally spoken…then my grandfather educated me about the use of New World Order in history. Damn.
A quick note on the recent, raging "Expert Failure" debate.
It appears to me that the experts have suffered a corruption of the systems they are a part of. Without getting into the weeds on what corruption means in this context, let's say that the reputational risk:reward on honest communication has become such that honesty is heavily disincentivized. Some of us have heard countless examples of "behind closed doors, my expert friends say so and so, but they wouldn't dare say it publicly" in the last couple of years.
So I suggest a solution to this: how about anonymized expert networks? This way, we get to hear from the experts, without any risks to the experts. Kind of like the semi-dark expert networks that private equity shops heavily lean on.
Similar to Metaculus, but with (an apolitical, test-based) screening for expertise and a focus on deep insights versus predictions. Would be nice if Bill Gates or some billionaire would set it up and provide compensation to the experts.
In our desperate search for truth in a post-truth world, filtering for expertise and adding anonymity may get us closer.
If it's accessible to the general public, there would be a lot of pressure to shut down such things, just as social media companies are currently pressured to censor certain things/people.
I doubt that the general public would have much interest. Few have the attention span for such stuff.
I don't think much of the general public is actually going to consume InfoWars or Stormfront. But people will work to make both inaccessible.
There's currently a forum called "Econ Job Market Rumors" where anonymous grad students gossip about job opportunities, and also badmouth certain econ papers & economists. EJMR is considered a scandal because these anons will write offensive, politically incorrect things. It gets blamed for creating a "toxic" environment in econ:
https://equitablegrowth.org/should-read-justin-wolfers-evidence-of-a-toxic-environment-for-women-in-economics/
Laymen wouldn't bother reading EJMR at all, because they don't care about the job market there. But if it weren't focused on the job market, it would be considered a threat not just to women in econ but the general public.
Interesting. Do you think EJMR is at any risk of censorship or being shut down?
I wonder if Web 30.0 will have a solution for this
Our robot overlords will delete all false data.
Sims who believe false things will be deleted.
Excellent article!
What makes it even harder to "know the game", is that it is not just one game, but that every scientific community develops their own rules. Climate scientists follow a pretty different set of rules than neuroscientists. If you are savvy enough to read papers from climate scientists, that does not make you savvy enough to read papers from neuroscientists.
And of course, all the same for journalism. Tabloids follow different rules than broadsheets. The science part of a newspaper follows other rules than the politics part or the sports part.
Being savvy includes knowing which articles and statements you can interpret right, and which ones you can't.
What rules? I've never seen such a thing in action. Do you have some kind of illustrative example? I'm neither a climate scientist nor deep into neurology, but I have no great problem reading papers by either. It's still English, not Sanskrit still less Linear A. There's often a ton of unfamiliar acronyms and such you have to look up, but that's why God invented google.
To be sure, I am not going to be in a position to make some finely-tuned judgment call of whether this shade of conclusion is 5% more probable than that other -- the kind of thing that exercises the people right at the frontier and leads to dueling 30-min talks at the next big conference. But this is a long way from being completely unable to grasp the degree of solidity that major broad cross-cutting Claim Foo is seen to possess. I've never found that to be a big problem, if I'm willing to put in the time required to bone up on the terms of the discussion. In what sense is this some kind of opaque process, where even an expert in one field is shut out of grasping what's going on two fields over? That really doesn't match my experience in science, which is that surprisingly distinct fields have *more* in common than one would naively think.
I switched fields from computer science to neuroscience at some stage of my career. At the beginning, I was pretty much lost, and the big turning point was finding someone who could tell me things like "yeah, don't trust this paper, they claim that they count synapses, but it is not really synapses that they count".
Another classic "lie" in neuroscience is the statement of the form "region A projects to region B". Of course, it is not a complete lie, but the truth is usually closer to "the connection from A to B is slightly stronger than the connection between two average regions".
Of course, that is fine because experts know these caveats. For example, they know that every computer model that is based on such anatomic connections must be heavily discounted. Authors of modelling papers know that, too. But in the paper, the only sentence about this is something like "Using anatomical data, we model the projection from A to B". I think that a typical modelling paper is pretty misleading for an outsider.
Or yet another example, Scott wrote in his other recent article on the EEG study, "I'm skeptical of social science studies that use neuroimaging". I agree with that, and I would more generally take such caution with neuroimaging studies, even in neuroscience. But that is specific knowledge about neuroimaging, not general knowledge about science.
Er...sounds to me like you're saying when you switch fields you start off rather naive, and need to learn a bunch about the new field, including the working definitions of many specialized vocabulary words, before you can usefully contribute. It's difficult for me to imagine any situation or social structure in which that would *not* be true. Experience and experientially-derived knowledge are a thing. That's why we can't learn everything important from a book, or Wikipedia, and I imagine every expert in every field would say that is true about his particular field. If you decided to be a plumber or house painter or grow wheat you would also need a great deal of experience-derived knowledge before you were able to do the job competently and efficiently.
So this is a long way in my mind from saying that scientific or technical fields are deeply tribal, e.g. that There Are Rules for how you can and can't say things in this community and they're not the same as in this other community -- sort of the way things are in the ideological arena, e.g. if you're calling yourself a "Rationalist" or "part of the reality-based community" or "woke" or "red-pilled" or just "blue" or "red" (in the US), then there really *are* unmentionables and shibboleths that cannot be questioned. Very different situation, to my mind.
Scott's point was not that newspapers are tribal. If anything, it is the opposite, because the red lines of "red" and "blue" newspapers are pretty similar, like not reporting falsely about an official police statement.
And even for The Rules, I don't think it's so different. There is an informal code for how you are allowed to criticize other people's work. I would claim that it is not allowed to write "We should be skeptical of modelling studies" in a neuroscience article. (In peer-reviewed articles; it is totally OK to say that in private conversations.) It is allowed to convey this message in an article, too, but only with specific formulations. And I don't think it is trivial to interpret such statements correctly if you are unfamiliar with the field.
I don't think it's just general politeness either. My impression from computer science is that there are fewer taboos about how you can criticize other people's work. If something's wrong, then it's wrong.
Well, I'm doubtful a priori. I would instead guess you are perhaps misinterpreting the resistance you are seeing: the actual root is "you don't know enough to be able to critique this or that accurately yet", not "you aren't allowed to say it or say it this way." You're jumping to the social explanation first, because that's the easy one, the natural human go-to explanation for weirdness, whereas in my experience it's more likely that your familiarity with the facts of the field is as yet insufficient to let you make your point with the nuance people expect.
It's like if I wrote a paper challenging the Higgs mechanism by saying "this is all bullshit because there are other possibilities for the data, e.g. these three" -- and proceeded to list three that had been long ago carefully considered, because my familiarity with the field was as yet limited -- I would get strong pushback. I could interpret that as "you're not allowed to criticize the dominant paradigm", but the real reason would be "if you're going to criticize the dominant paradigm you have to have all the background at your fingertips so you don't do it in an annoyingly boring way where people have to point out yeah that issue was raised and thoroughly discussed in 1973 so RTFM wouldja?"
Funny, my impression of computer science is the opposite. My impression of programmers as a tribe is that they are unusually brittle, psychologically speaking -- have a much harder time accepting that not everybody can, or ever will, agree on The One True operating system/way of programming function X/correct way to criticize other people's work/acceptable way of asking the girl at work out. They tend to insist on black and white even long past the point where the rest of us, given the muddled state of evidence, agree to call it a shade of gray somewhere between your preferred Pantone and mine. They tend to get into orgies of debate over The Rules because they are far less flexible about the role of rules in behaviour than other tribes.
There are a bunch of unwritten rules in journalism that are helpful to know.
For example, when I started reading newspapers during the Nixon Administration, I saw frequent references to "an unnamed senior Administration foreign affairs advisor said..." I assumed as a child that this could refer to any one of a few dozen officials. I only found out years later from reading Henry Kissinger's memoirs that it meant "Kissinger."
This is one of those posts I can tell are true and important because it leaves me totally uncomfortable and unsatisfied.
Let the cognitive dissonance flow thru you! :)
Agree. It's also got a comment section full of the "I totally agree with nearly everything, but here's why your examples about my side of the culture war are wrong and bad, unlike your examples about the evil people on the other side" nonsense that's making it close to unreadable lately.
That sounds like a fully general counterargument. Let's say one side really was worse in some aspect - we should be able to discuss arguments for why that might be the case. (see "Bulverism")
That might make for interesting comments on a post about asymmetries in cognitive errors in groups with differing political and/or cultural beliefs. But when the comments are full of the butthurt complaining about their ox getting gored while Scott, who is clearly a [communist/fascist/aren'ttheythesamethingreally], is not goring the other guy's ox, it's not very interesting, just tedious. [edited for clarity]
No, it's just not seeing the forest for the trees. The article was full of scissor statements. The point of the article was to not get hung up on particular controversial topics and to see the larger point - because there is a larger point. But that's something that goes very much against our nature; it requires executive function, i.e. effort and discomfort. That's why I said that feeling uncomfortable is a good proxy for it being useful.
Those interested in a rigorous look at immigration as it relates to sexual criminality in Europe should read Ayaan Hirsi Ali's latest book, Prey: Immigration, Islam, and the Erosion of Women’s Rights.
I agree with Scott that it is a really important skill to "bound your (dis)trust" when interpreting public statements (or your friend Tina when she says the food at this new restaurant is great and you should go there).
What I disagree with is the sense I get from the article that this is a binary skill (you either have it or you don't). I think this is a very hard task; everybody struggles with it to some degree, and it is often impossible to figure out the right amount to trust (or what the exact bias is). Your own priors will also determine how much you should trust someone or what to take away from the statement. Really, it's just a special application of Bayesian updating, and we know how easy that is in practice.
Case in point, I think "the WP says the election was fair" should be compared to "Saddam has WMDs" rather than "mass shooting in NY". Why? Because these are the two cases where the media coverage, to a first approximation, can be explained by the fact that it repeats the official governmental position, on issues where the media would have a much harder time if they wanted to endorse a different position (much like the problems the Swedish immigration crime rate study experienced). So if somebody is convinced that the election was rigged despite all official bodies saying it wasn't, the WP article isn't going to change their mind based on bounded distrust.
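To make the "special application of Bayesian updating" concrete: how much a report should move your belief depends on how likely the source would be to publish the claim even if it were false. A minimal sketch, with all probabilities invented purely for illustration:

```python
def posterior(prior, p_report_if_true, p_report_if_false):
    """P(claim is true | source reported it), by Bayes' rule."""
    evidence_true = p_report_if_true * prior
    evidence_false = p_report_if_false * (1 - prior)
    return evidence_true / (evidence_true + evidence_false)

# A fact the outlet would almost never fabricate (e.g. a shooting happened):
print(posterior(0.5, 0.9, 0.01))  # ~0.99: the report is strong evidence

# A claim the outlet might push either way (contested political framing):
print(posterior(0.5, 0.9, 0.60))  # 0.6: the report barely moves you
```

The arithmetic matches the thread's point: when p_report_if_false is high for a given topic, even a truthful report carries little information, so "bounded distrust" has to be calibrated per topic, not just per outlet.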
I think the "binary" part of the skill is whether you trust that you're 'good enough' to extract any useful info generally, not that the skill itself is binary.
I think you're correct, but I disagree that it's a good conclusion. Knowing that we can glean some good information from a messy mix of truth and lies may lead us to read and internalize information that is false that we fail to separate properly.
That seems inevitable unless we're perfectly skillful at gleaning information from "messy" sources.
The solution is to demand that our sources be less "messy" by refusing to read and/or pay for overly "messy" sources.
Sure, that's sensible for individuals – I do that myself.
But that certainly doesn't seem like a general solution for everyone, especially if one cares what others believe.
I would take a different lesson from the examples you provide:
1. School shooting - All news portals would say that the same person killed the same number of people at the same school. There are very few variables, like the number of victims, the name of the killer, etc., and the values of these variables are not open to interpretation. You cannot say that Abdullah looked like a John.
2. Election malfeasance- The only direct variable involved is "Election fair=True/False". This variable is impossible to measure directly. Hence, either party is free to choose other variables that are indicative of the value of the direct variable. For example, Fox News might choose "trends in past elections in swing states" and say that time-honored trends were not followed in 2020, indicating that the election was not fair. Washington Post might contradict this analysis, and so on. It is only when direct variables cannot be measured, and we have to study indirect variables that we are free to choose in the manner of p-hacking, that news becomes open to interpretation.
I take your point about experts not being willing to sign a petition with a false claim. But this is manifestly untrue when the issues involved are political. Middle school education experts sign petitions claiming that more funding poured into middle school education drastically improves the education outcomes of students independent of IQ. Also, Bill Clinton, the expert at having an affair with Monica Lewinsky, lied about his affair with Monica Lewinsky. Self-interest can muddy the waters significantly for experts even when talking about their own fields.
The number of people killed could vary because people initially counted as wounded later die, and articles don't automatically update.
I meant that different news portals reporting on the incident at the same time will report the same number of dead people.
> Bill Clinton, the expert at having an affair with Monica Lewinsky, lied about his affair with Monica Lewinsky
When giving evidence, they had a long discussion about what constituted "sexual intercourse", decided that blow jobs didn't count, and he then, truthfully, said that he never had sexual intercourse with her.
It is one of the classic examples of the non-lie lie.
Specifically, they gave Clinton a definition of "sexual relations" for the purposes of the question that was probably intended to include blowjobs, but had enough wiggle room in the exact wording for Clinton to parse it as meaning that she had sexual relations with him when she gave him blowjobs, but he did not have sexual relations with her.
This is great, and I plan to share this with friends.
I would be very happy if I felt that most blue tribe people would agree with this. But my experience is that no, they won't. They get angry and mad if I say that the New York Times isn't really a reliable source because of its biases.
If that is what you got from the article, "The NYT is unreliable because of biases," then I am sure your blue tribe friends will not agree with you.
[edit: This is basically equivalent to me saying, "Yes! I agree with the article. Red tribe people _are_ mostly incapable of logical thought. I hope my red tribe friends will finally see that I have been right all along."]
Do you mind sharing what you consider to be a reliable source?
Even sharing the following claims:
> There are lots of cases where you can’t trust the news!
> I believe that in some sense, the academic establishment will work to cover up facts that go against their political leanings.
Tends to get angry, dismissive responses from blue tribe friends. Any thoughts on why?
This obviously depends on the individual. My middle brother has Down syndrome. For him, a reliable source is his family. For me, NYT is fine. Fox would be fine too but a smaller percentage of the articles on NYT fill me with rage for being manipulative and/or disingenuous. There will never be an article in any publication that will define my understanding of an issue. They are just datapoints.
With this philosophy and a willingness to roll your eyes some percent of the time, I suspect you too could find useful data on NYT. Or is the issue that you assume that most NYT readers are sheep and being swayed to a political ideology you find repugnant? (by an occasional disingenuous article)
> Fox would be fine too but a smaller percentage of the articles on NYT fill me with rage for being manipulative and/or disingenuous.
Just this phrasing here is one that I think most of my blue tribe friends (and this includes a fair number of siblings) would categorically reject. I have 8 biological siblings. None of us voted for Trump. Two identify as moderates; the one that I think really is moderate is generally seen by the rest of us as conservative.
The biggest split between us, I think, is best defined in terms of either:
a) a general distrust of all media sources as being, in your words, "manipulative and/or disingenuous" (this is where I am)
or - and this is harder to express because I'm still unclear on what it is -
b) the view that there are heavily biased sources (Fox is an example, but so is, say, the Huffington Post or any political commentary show), and then there are sources which are more or less reliable
Those of us who think that corporate media is fundamentally unreliable blame it for getting Trump elected. Whenever we try to bring these points up - that _all_ sources are biased and disingenuous, especially when it comes to their own biases - I repeatedly encounter a rejection of the premise. If I try to argue that, say, some fields have a general left-wing bias, this gets rejected as well.
I would see the world very differently if I thought people saw Scott's blog as the standard for journalism. I do consider Scott's blog to be a reliable source of data, and what I mean by this is: I don't feel the need to investigate to figure out what Scott is lying about. This doesn't mean that I'm a sheep who will just believe whatever Scott writes.
Your responses are antagonistic enough that I'm going to disengage here, unless your rhetoric changes. Sure, I could possibly get useful information from an exchange with someone who is aggressively antagonistic and not interested in understanding where I'm coming from, but why would I bother doing that - or reading sources which I expect to 'fill me with rage' - when I can just _not_ do those things?
My first reply may have come across as antagonistic. I may have misunderstood your original position. Possibly, based on that, you misread my second reply? It did not feel antagonistic to me. I certainly was considering myself as possibly equally guilty of the behavior I was describing.
On the other hand, I don't actually see what it is that you think we disagree about which is just as good a reason to end debate...
Cheers
This is partly a test. This post and the more recent one about poverty and EEGs won't load on Chrome. At first, they would load and then quickly switch to "too many requests". Now this one will load from Opera but not Chrome.
As for the topic, this isn't just about news sources; it's also about cancel culture, both right and left, which is based on deciding that some source is completely disposable.
> This is partly a test. This post and the more recent one about poverty and EEGs won't load on Chrome. At first, they would load and then quickly switch to "too many requests". Now this one will load from Opera but not Chrome.
I'm having the same issue on Chrome, did not replicate on other Substacks. Easy enough to work around that I haven't dug deeper yet, but it appears to be affecting older posts on ACX as well. (But curiously, not the comments-only pages.)
I use Chrome, and I haven't had any problems.
Poking around a little more, disabling ACX Tweaks made the issue go away. Can't point to exactly what was causing the issue without further investigation, but I pinged Pycea about it.
"They don't talk about the "strong scientific consensus against immigrant criminality". They occasionally try to punish people who bring this up, but they won't call them "science deniers"."
The first statement is true, the second statement seems ~ false. When people bring up similar results they are often accused of "peddling pseudo-science", which seems functionally analogous to "science deniers." If someone asserts 'immigrants commit a disproportionately large amount of crime' they wouldn't be called "science deniers" only because it doesn't generally make sense to call someone a denier for asserting a positive claim.
Noun-ifying things that you hate does seem to be a common trend, though. You aren't just someone who partakes in a certain behavior or holds some belief X - you're an X-er, an anti-Xer, an X truther, an X believer, an X denier. It's a way to otherize.
I think the issue I have reading this is the impression I get that you, writing it, and everybody reading it, is going to think "Oh, I'm smart enough to have the correct level of distrust". It smells like just-world theory, only about intelligence instead of morality.
"But also: some people are better at this skill than I am. Journalists and people in the upper echelons of politics have honed it so finely that they stop noticing it’s a skill at all."
If it's a skill then there isn't a "right" level per se, though there can be 'good enough'. Scott's acknowledging he isn't flawless, though as always one should have less faith in readers than writers.
First, that's exactly what I'm talking about, when I say it smells like a just-world theory; if you get tricked by journalism, you just weren't skilled/smart enough, so really it's your own fault for not understanding what was going on. Skilled/smart people don't have this problem, it's the ignorant rubes.
But notice something: Nobody is going to think they are the ignorant rubes. This is a post which claims to illustrate something, but if you think about it, it's just telling people what they want to hear - that they have this skill that nobody talks about that lets them detect the Truth in Media.
Because, with exceptions not worth talking about for the purposes of this discussion, EVERYBODY applies a level of bounded distrust to every news agency; nobody (for the purposes of this discussion) thinks the news agencies are lying about whether or not it is currently raining in the city they're reporting from. Everybody thinks the media is lying to them about [insert opposing political/tribal belief here].
And Scott acknowledges there are certain times the media is fine with lying; there are things you cannot trust the media about at all (otherwise it wouldn't be necessary to clarify that the media doesn't lie about these particular kinds of things - it is fine with lying about other kinds of things), but there are other areas where they're careful. There are rules! You can understand the rules!
So somebody who is a climate skeptic can say "Well, this is one of the areas that the media lies about, the rules allow it here." Somebody who thinks the election was hacked can say "Well, this is one of the areas that the media lies about, the rules allow it here." Everybody can read this and think to themselves "I'm one of the people who understands the secret code of the media, I understand how and why they lie, and I can see into the truth of it." And think everybody who disagrees with them is an ignorant rube who (not exactly through fault of their own but also really if they were smarter or bothered to try to get better at this it wouldn't happen) is misled by the lies of the media and/or by their own paranoia about what the media is lying about and how and why.
(Not to even mention the fact that even if there were a set of intelligible rules in this sense that we could even agree on, as soon as they became public knowledge, they'd change, for roughly the same reason that you can't have common knowledge about how to beat the stock market.)
> But notice something: Nobody is going to think they are the ignorant rubes. This is a post which claims to illustrate something, but if you think about it, it's just telling people what they want to hear - that they have this skill that nobody talks about that lets them detect the Truth in Media.
Hard disagree - think about it as a skill like violin playing, and the answer is obvious. The vast majority of people are very poor violinists... and don't spend their time playing the violin. I know plenty of people who voluntarily decline to engage with any meaningful level of news journalism, who don't find it meaningfully blameworthy to not possess the media literacy to navigate the existing ecosystem. You just won't find them online, in the comments section, on a post on truth in media. The selection effects are obvious.
More to the point, I don't see anywhere where Scott makes even the implication that the reader of this piece is particularly skilled. Did I miss something, or is that a novel inference?
> (Not to even mention the fact that even if there were a set of intelligible rules in this sense that we could even agree on, as soon as they became public knowledge, they'd change, for roughly the same reason that you can't have common knowledge about how to beat the stock market.)
No. Anti-inductive behavior is recursive, most typically when predictions are fed back into the behavior of that which they predict. In contrast, editorial strategy is a balance between appealing to the readers' moment-to-moment interests in the short-term versus maintaining integrity for longer-term credibility. You might see a feedback loop of "improvement" when the tradeoff is not being made optimally (think movement towards a Pareto frontier), but this isn't the same as a motivated liar trying to scam a suspicious mark.
People realizing tabloids are trash will not stop tabloids from being trash, because they are not trying to be not-trash.
For the entire first section of your response, I'm satisfied with what I have already said on the subject, and see no need to continue that line of conversation.
For the second, if what you said were the case, we should expect "trust in news media" to be relatively stable; it's clearly not the case that editorial strategy is succeeding at maintaining integrity for longer-term credibility.
More, we have concrete evidence that editorial strategy has included policies which look an awful lot like "a motivated liar trying to scam a suspicious mark", in the form of previously-private message board conversations in which editors coordinated to lay out strategies of what to cover, and what not to cover, and how, for explicitly political purposes.
Given that editors are explicitly trying to direct public opinion, which requires successfully predicting behavior, then if people become aware of the strategies employed to predict their behavior, they will change their behavior in response. This is exactly the kind of situation which causes anti-inductive behavior.
> For the entire first section of your response, I'm satisfied with what I have already said on the subject, and see no need to continue that line of conversation.
I would actually like a response on where Scott implicated that the readership uniformly falls into the "savvy" category. IMO, the SSC comments sphere actually selected fairly strongly against media literacy compared to LW. That's probably an unfair comparison given different audiences, but I don't feel terrible about having high standards.
I'm comfortable breaking with Scott on this topic if necessary, but the article is pretty notably not written to either group in particular. Contrast how Conflict Vs. Mistake explicitly staked out one particular side both by Scott and the blog as a whole.
> For the second, if what you said were the case, we should expect "trust in news media" to be relatively stable; it's clearly not the case that editorial strategy is succeeding at maintaining integrity for longer-term credibility.
First: "news media" isn't an organization and doesn't have unified incentives. See also: approval ratings of Congress v. "my Congressman". This is a critical distinction if you find yourself trying to predict the actions of actors that don't actually exist.
Second: are you really comfortable assuming away structural changes in content delivery or audience preferences? The media landscape has seen nothing but a series of exogenous shocks going back at least four decades at this point, and I wouldn't have the faintest idea of how to begin controlling for that.
Third, and with a hope of injecting some empiricism: how unstable do you think "trust in news media" *is*? Scott's commented on the Economist / YouGov polling on the topic before, and pulling in more recent 2020 data it looks like the change in weighted net trustworthiness since 2016 is NYT -1.5 less trustworthy, WaPo -0.5, WSJ +2, CNN -4, Fox +1.5, MSNBC -0.5. Given the healthy ~3% margin of error, the clear takeaway is that... nothing has notably changed? Is that what your model was predicting?
"While I've heard rare stories of the media jumping in too early to identify a suspect, "the police have apprehended" seems like a pretty objective statement."
The irony is that this literally happened last week with Malik Faisal Akram in Texas.
I understand the point being made, but I think part of our current problem (with vaccines) has to do with how much trust the pharmaceutical industry has burned in the past 25-30 years. There are some few people who won't get the shot as a political marker, but there are others that look back to the opioid epidemic and the hand-in-glove relationship with our regulators and the hair raises on the back of their neck.
We live in a world where there is no universally recognized truth Pope who can bless stories as being basically correct. Without that no one is smart enough or has enough time to review everything in enough detail to be sure it’s right. I know you just said that, but… god it’s depressing sometimes.
It’s reprehensible that so many people in media use their positions for political gains and obscure the facts. It’s a shame that it’s become the norm.
I would be less concerned if I didn't also have the sense that they'd deluded themselves. I can never get over the Twitter variant of: “I talked with my three year old today about complex geopolitical issues and they were immediately able to comprehend all the nuances and agree with my political opinions!” Which on its face you know never happened, and yet my guess is that any person tweeting that would be able to pass a lie detector test on pure power of will. I don't think it's as bad as we fear, but I also think it's a situation where not that many people have to be bad actors before the entire lake is polluted.
That norm is far older than homo sapiens. Not entirely a bad thing either, if you're a fan of the Machiavellian intelligence hypothesis.
For what it's worth, this is the first time I've seen mistrust of vaccines connected to the opioid epidemic.
For all of the ink that’s been spilled about Rogan’s medical misinformation, it’s a point he returns to fairly often. A lot has been made of the Robert Malone episode, but the John Abramson episode immediately preceding it was much better content in this regard. (Abramson, for what it’s worth, is a proponent of the vaccines)
My throughline on mistrust runs backwards from Covid vaccines are safe/masks don't work etc back to opioids are awesome, picks up some SSRI boosterism/side effect denialism, continues back to cigarettes don't cause cancer, pesticides are safe, and radiation is good for you. Also, that groundwater there is not contaminated and you'll be fine.
If it were not for the threat of how sick Covid could make me, I would have passed on the mRNA vaccine (of which I've had three) because I do think corporate-sponsored science is absolutely famous for saying "it's totally safe" not just when they don't know enough to say that but when they have actual evidence to the contrary that they're actively suppressing.
But that's just me and I'm not particularly proud of it. I don't love conspiracy theories. I do think corporations and the politicians that work for them have shown a lot of willingness over generations to lie and downplay harms that only become evident years later when the evidence is no longer deniable, but loads of money has been made meantime. And also that they're willing to factor in a lot of acceptable losses, so that their idea of safety isn't my idea of safety always.
[thank you for letting me insert this rant here; I feel better now]
Oh and that big plane there we've redesigned in a hurry under a narrow profit margin while regulating ourselves with no oversight, it's totally safe.
But meantime, if you smoke pot, you're going to become a drug addict and that magic mushroom/LSD stuff is definitely going to kill you, even though the alcohol your family is drowning in is perfectly fine.
I don’t think mistrust of pharmaceutical corporations can be described as a conspiracy theory. It was the uncontroversial norm as of 2019. In fact, Pfizer’s public perception of trustworthiness flipped from being in the bottom 10% to the top 10% within a year. There are plenty of documented reasons to cast suspicion on these guys.
I, however, am also vaccinated.
I agree with you. I just meant dispositionally I'm not drawn to conspiracy theorizing AND I still have a huge amount of mistrust of a specific slice of things sometimes associated with conspiracy theorizing. Of course in my case, I feel like it's evidence-based, my mistrust.
Of course it's a conspiracy theory. It's a theory that people worked together to commit harmful acts, and didn't tell others about it.
This recent conflation of "conspiracy theory" as "people conspiring with each other, one of the most fundamental human behaviors" with "a crazy theory that only wackos believe and that is definitely false" is very disturbing. It implicitly suggests that no one ever conspires, and that theories about people conspiring must be false, by definition.
It's almost like the people committing conspiracies would want this to happen...
Fair enough. I don’t think that’s recent though, that term was coined as a pejorative.
My theory is that, aside from people being told to specifically distrust covid vaccines, the mistrust is built on personal bad experiences with the medical system, and that personal bad experiences are not part of the discussion because mentioning them might imply that the medical system should work on treating patients better.
On the other hand, I don't know whether there's less mistrust in countries with better functioning medical system.
There seems to be an agitating anti-Covid-vaxx population in most western countries, including those who I personally think have far better medical systems than we do. (I’m being an American by assuming you’re an American)
I can buy into your theory though, especially given the number of non vaccinated black Americans who have some historical reasons to give the medical system some side eye.
I tell you three times, this is not about Tuskegee. This is about the doctors and nurses who ignored symptoms and ignored pain, and did so with great assurance, in the patient's own life, and in their social circle.
This article is probably the main reason I read Scott. I identified that he has this skill much better than I do, and would never intentionally state something he knows to be certainly false. If there is a chance of it being true or false, he uses qualifying words or even percentage guesses. And he rates his guesses at least annually, to calibrate himself. Currently, I cannot name any other source that is both better at this skill than Scott and this honest. If anyone else has suggestions, that would be interesting though.
+1 I would recommend nearly all of his blogrolls (at ACX and SSC) + some other substack-writers: Erik Hoel (got recommended by Scott, not a lot of posts yet, but good; one of my favs: https://erikhoel.substack.com/p/publish-and-perish ) and Thomas Pueyo of "hammer and dance" fame https://unchartedterritories.tomaspueyo.com/ (he puts the more interesting half of his texts "for subscribers only", but that is ok). But "I cannot name any other source that is both better at this skill than Scott and this honest." And a better writer, I'd add.
I think maybe I'd trust Zvi about some stuff more than Scott but they're, in my mind, in at least 'the same ballpark': https://thezvi.substack.com/
You made a similar point in another post: “it’s not that bad if experts get things about Covid right two weeks later than MTG players”. Yes, they are like 2 weeks or 2 years (as with N95s) late on Covid. But on other topics they are decades late and not catching up. For example, beliefs about education and signalling. You compared schools to child prisons, a position comparable in its anti-expertise to denial of anthropogenic influence on global warming.
Two things: (1) Media bias isn't that hard to correct for, for people who have an interest in doing so. But the supply of biased stories is created by a demand for bias. Most people have no desire to correct for bias. The reading skills and thinking skills you discuss here (as with the Lincoln-Marx example and the government harvest prediction) are not nearly as hard as you make out for people motivated to suss out bias. (2) I'm usually more concerned about ignorance in news stories, which may or may not be filtered through bias in the experts chosen to ornament a story. My standard method is to pick stories in some given outlet which are about something in which you know more than the journalists. They almost always get it wrong. That usually isn't bias (although they may be biased as well.) They just aren't trained in whatever the subject is. You can use the difference between what you already know and what they report as Bayesian prior evidence for future stories in which you have no specific domain knowledge.
Isn't that hard to correct for for whom exactly? I think a big part of the 'culture war' divide (and all kinds of similarly polarized beliefs) is that, for most people, epistemology is almost entirely social. I find it depressingly rare for anyone to even _attempt_ to understand anything at a 'gears' level – even otherwise smart people!
On the other hand, I think you're right that, given sufficient motivation, people are (perhaps surprisingly) 'good enough' at doing this in practice. I think there's a very strong 'selection bias' in thinking about these issues mostly for the most controversial subjects and not noticing that there's quite a lot of knowledge/info that people mostly-competently handle.
Yes, Gell-Mann Amnesia is real, and you can somewhat correct for it, especially after you've experienced it yourself first-hand.
I agree. I don't think there's any way to find truth for people who aren't motivated to seek truth over confirmation. The market forces leading to confirmation are just too strong.
Great piece. The screening ability may be somewhat (ha ha) rarer than Scott suggests. Wonder if changes in education over the past 20 yrs have affected the prevalence of the skill. Did no child left behind reduce or enhance adults’ truth detecting intuition?
The AIER article on Lincoln & Marx was written by Philip Magness, who regularly mocks anyone who takes Chinese data on COVID deaths/cases seriously. In contrast, Greg Cochran (of "creepy oracular powers" fame) realized how serious COVID was based on the Chinese government's reaction, and regularly mocks COVID skeptics (whom he's also won multiple bets against) for thinking that large numbers of dead bodies are the kind of thing that could be easily disguised.
OK--I don't love Phil Magness either; I think he's annoying. And I agree that China trades with the world enough that we can be pretty confident about the ballpark of their COVID death numbers. But this seems to be more or less shooting fish in a barrel. Is there any direct evidence that Lincoln had even heard of Marx? Marx was not very well known until (at the earliest!) Capital vol. 1 came out in 1867... 2 years after Lincoln was assassinated.
Maybe Lincoln saw a letter wishing him well, maybe he read some articles about the Sepoy Rebellion or whatever irrelevant matter by Marx in the New York Herald-Tribune. But what actual reason (besides Kevin Kruse-esque twitter induced brain damage) is there to suppose that Lincoln had any significant connection to Marx? The idea is, to be frank, just absurd on its face. Maybe only an annoying guy like Phil Magness is willing to take the hit for saying that in the contemporary information environment, but that doesn't mean he's wrong.
Oh, I think Magness is correct about Lincoln/Marx. That's within his expertise as an historian. Contemporary Chinese death stats aren't within his expertise, and parsing Chicom pronouncements for truth vs falsehood is outside as well. Greg Cochran, on the other hand, not only has domain expertise in disease, he can also engage in "bounded distrust" of others. For example, he says one reason he knew there were no Iraqi WMDs is that he regularly read the NYT (even while their own Judith Miller was hyping the threat at the time) and remembers what he read.
Deleted my previous posts after I thought about this a bit more. The belief that "media literacy" is a skill seems to rest on a flimsy assumption: That biased journalists/experts are writing in a secret code where 99% of people will read it and believe a lie, but the truly smart people will decode the article correctly and find the truth. There's no reason to believe this always holds, even if it holds sometimes. A journalist with a single stroke of a pen could change the article so that there is no way to get at the truth. An organization could watch people successfully "decoding" the articles on Twitter, and adjust their writing style so that using the same decoder "key" will uncover another lie. It does not seem worth it to try to engage people who are writing in bad faith in such a way out of a belief that somewhere in there is a tiny speck of good faith.
>A journalist with a single stroke of a pen could change the article so that there is no way to get at the truth.
Well, yes, the pen could be an extra-wide sharpie that they sweep across the whole line of text. No way you'll ever figure out what the underlying truth was. And yes, there are lesser forms of obfuscation, but the end result is the same - pure confusion, serving not even the reporter's private interests.
The point is, biased journalists want to create the *impression* that they are delivering useful and accurate information, because without that they might as well just scribble with the sharpie. But there are still rules and norms as to what they can and can not do in the pursuit of their possibly-nefarious goals, and see Dan Rather for what happens if you cross the line. So, in order to create the impression that they are delivering useful and accurate information, they have to include some actually true and accurate information, and if you know the rules you can pull some of that out of the muddle.
Concisely stated, the rule would be to always believe extremely objective statements. As a corollary, if somebody isn't making objective statements (i.e. is using weasel words), just ignore them. Good heuristic if you can use it (sometimes weasel words are just too common).
While I generally agree with that, it means scientific frauds can go for awhile without being discovered.
Example: https://en.wikipedia.org/wiki/Sch%C3%B6n_scandal
People close to him knew he was a fraud (source: somebody who was close to him told me) but Bell Labs management didn't want to believe that and we typically accept the raw data from scientists as true; peer review is about checking methods but assumes good faith. Schon was caught because he had reused figures in ways that couldn't possibly be anything but fraud. If he had been less lazy he might not have been caught. Makes one wonder how many uncaught frauds are out there.
It would be easy to write this off as an outlier, except outliers can have extreme effects.
The analogy to news might be using photos at the top of an article which are true but wildly misleading, something which happens all the time. Stock photos to set the scene are harmless, but often people cross that line too.
Really good documentary about Schön: https://www.youtube.com/watch?v=nfDoml-Db64
I think this is a true and important post.
One extension I would like to add is this also applies to other matters like trusting governments (ie China). I was surprised to learn over the last few years a lot of people lack the skill to appreciate the scope and types of things the Chinese government can lie about.
Something else I want to add because I'm still upset about it.
Three years ago, Jeff Bezos publicly accused the Saudi government of hacking his phone when he knew this to be false. A 100% baseless claim. I only highlight this because I thought it was quite sad to observe that Jeff Bezos thought this was an acceptable thing to do and nobody seemed to care he did it.
To the point of this article, 100% outright falsehoods are really bad and those who commit them should be shunned.
Why do you think Jeff Bezos knew his claim that the Saudis were out to get him (rather than his quasi-brother-in-law) was false?
My impression is that major player insiders like Bezos, Hillary, and Trump tend to be conspiracy theorists. Disdain for conspiracy theorism tends to be most common among upper middle class small-timers.
I think Scott’s core point is trivially true—there is some core of objective fact to most (not all) media accounts that very few journalists or experts would actually lie about—but the larger post seems like a bit of a motte-and-bailey argument. Those of us who are radical media skeptics don’t take issue with that proposition in theory, we just think that category is much smaller and far less significant than the post implies, and that what gets reported in the first instance and what does not is carefully curated, and that selective reporting and omission of important context eliminates any usefulness of the media in all but the lowest common denominator sense (i.e. I believe that if the media reports a demonstration in my part of town, I can reliably predict increased traffic and logistical difficulty in that area). Scott’s post suggests it’s possible for the sophisticated to extract more useful signal than that; I think that’s cope by those who don’t want to acknowledge how bad the situation is or are concerned about the consequences of enough people thinking like that. But that’s, like, just my opinion, man.
The discussion of particulars in the comments here seems to kind of miss the point a little. The real issue is how much you should update your views based on even true information provided by those with the ability, incentive, and stated intention to selectively present such information to you. I think the answer is “not very much” and that goes to zero if a particular event is already in your “this happens sometimes” category.
One problem in 'hot debates' is that one side's journalists can simply refuse to mention certain facts or nuances, either in general or at least until the damage is done.
If you're trying to argue someone out of a misapprehension caused by lousy reporting you might be stuck resorting to sources that the person is primed to reject categorically on grounds of bias.
Note that the sophisticated tend to "extract useful signals" that happen to agree with the beliefs they already have. Other people who engage in the same kind of reading of tea leaves and come to disagreeable conclusions are being either gullible or conspiracy theorists; only the exact right level of distrust, which everybody thinks they have, nets you actually useful information.
"Scott’s post suggests it’s possible for the sophisticated to extract more useful signal than that; I think that’s cope by those who don’t want to acknowledge how bad the situation is"
I extract useful signals from the New York Times and Washington Post every day.
There's a set of folks that I'll just call "epistemic institutionalists" [I've heard the term intellectual authoritarian used but the 'A' word has negative connotations]
i.e. the idea that the responsible thing is to simply teach someone to be capable of identifying the institution that promulgates a fact/set of facts/narrative and either trust or dismiss what is said without actually sifting through the contents for value.
Now, the kind of skill that Scott is describing is basically that of an extremely high reading level which is at times combined with varying degrees of statistical literacy. There's also implicitly a certain temperament (emotional detachment) required but let's waive that for a moment.
The idea that, even under the most ideal circumstances, a huge portion of the population is not going to attain a particularly advanced reading level and/or mathematical aptitude (perhaps because of some underlying physiological trait that can't be significantly enhanced through environmental or medical stimulus) is *extremely* taboo with the aforementioned institutionalists.
If you're an institutionalist you more or less have custody of the youth's instruction from the ages of 7 to 18 and beyond. If you decide what you'll do with that time is to drill your students in trusting and dismissing sources out of hand you're more or less operating on the assumption that the vast majority will never attain the level of skill needed to do what Scott describes.
Perhaps experts don't lie about immigrants committing more crimes than natives, but I think they come pretty close. E.g. in https://www.svt.se/nyheter/inrikes/kriminologen-jerzy-sarnecki-las-in-unga-valdsbrottslingar-lange Sweden's most prominent (in mainstream media) criminologist says that immigration has not increased the amount of violent crime in Sweden. It is very difficult for me to believe that this expert truly believes that in the absence of large-scale migration of exotic peoples, Sweden's gang-rape statistics would have looked the same.
It's pretty common for experts to make statements that naive nice people interpret wrongly. For example, lots of nice people believe the "Girl With the Dragon Tattoo" myth that the rape in Sweden problem is overwhelmingly due to neo-Nazis.
https://www.takimag.com/article/fight_the_imaginary_power/#ixzz1hnodDydT
As Scott has more or less pointed out, the race/ethnicity-crime correlation is a massive unwelcome problem even for this website to deal with. There are some facts that are too factual to be stated.
This is, more or less, how I learned about covid in November 2019. I specifically follow right wing news aggregators because I see things I don’t see in left wing/mainstream news aggregators. Enough of it checked out and the way they presented it wasn’t the way I would have expected them to present it if it was entirely fiction.
COVID was being reported on in left-wing "prestige" publications as well at that time (though it was presented as a domestic issue in China). Perhaps the main issue is more that media outlets for "regular people", like cable news, simply operate differently and assumed there was no way to talk about COVID without people freaking out (and perhaps they were right, given their experience with turning ebola into a national panic despite it never having been a threat to Americans whatsoever)
Fair enough. I think it’s a question of volume/loudness. I’m certain you could point to any large group of people and find an early warning signal. I suppose it’s a measure of institutional effectiveness how quickly that signal made it to the top/official mouth/place of prominence. For me, people who were otherwise, to use a technical term, batshit insane about things like Trump were entirely credulous COVID reporters as a group. I’m extremely distrustful of mainstream press (to the point I try to adjust my own internal thinking because I know my initial reaction is going to be too harsh), and while I wish your statement about not wanting to cause a panic were accurate, I don’t think they make those kinds of assessments. I just try to stand far back on a hill and watch everyone fight, and then glean who is telling the truth based on something like troop movements. That’s what allowed me to notice it early.
“ Perhaps the main issue is more that media outlets for "regular people", like cable news, simply operate differently and assumed there was no way to talk about COVID without people freaking out”
It seems a bit more troublesome than that, since cable news has, since March 2020, been as responsible as anyone else for talking about COVID in a way deliberately designed to freak people out.
> There are lines they'll cross, and other lines they won't cross.
Good post, but I think the real issue is that they're crossing that line because it confers some advantage on them, and that's likely because they *know* some people will misinterpret it. That's clear deception no matter how you slice it, and therefore we have to ask ourselves whether we should tolerate line-crossing at all.
For anyone looking for more examples of this reading-between-the-lines process, it is sometimes referred to metaphorically as Kremlinology.
I don't know if links will get blocked, but for anyone else looking for a link to the study (rather than a news source's review of the study), it is here: https://www-tandfonline-com.translate.goog/doi/full/10.1080/20961790.2020.1868681?_x_tr_sl=sv&_x_tr_tl=en&_x_tr_hl=en-GB&_x_tr_pto=sc
If I were to try and summarize it I'd say: they used statistical tools that don't start with a hypothesis, so neither immigration nor anything else was a "particular focus" (although presumably they assumed that some of the data they'd fed into the model would turn out to be relevant). However, after the statistical tools were run, in their own abstract they say that a "key point" is "The majority of those convicted of rape are immigrants."
> What’s the flipped version of this scenario for the other political tribe? Here’s a Washington Post article saying that Abraham Lincoln was friends with Karl Marx and admired his socialist theories.
A better comp to the Fox scenario you describe would be just over a week ago several liberal outlets describing the Texas synagogue hostage taker as a "British man" and then covering a press conference where some fed said something to the effect "doesn't appear this had anything to do with Jewish people." Ok then!
> The 2020 election got massive scrutiny from every major institution.
The 2020 election was highly irregular and far from receiving scrutiny, every major institution continues to refer to it as the freest and fairest election in our nation's history. Including evil Fox News. Doesn't mean it was rigged but it received anything but "massive scrutiny."
> They occasionally try to punish people who bring this up, but they won't call them "science deniers".
Too good. "You may violate our women, suppress dissent, and make a mockery of any concept of democratic governance. But don't you dare call us science deniers." How to control a rationalist with this one easy trick!
> The reason why there’s no giant petition signed by every respectable criminologist and criminological organization saying Swedish immigrants don’t commit more violent crime than natives is because experts aren’t quite biased enough to sign a transparently false statement.
Why bother signing a transparently false statement when the government can just bring "misconduct" charges against them (only when results lead in a certain direction, as you point out)? As for the climate change example: a global phenomenon with global funding sources. No one government can silence dissenting voices so it is left to deep pockets to offer the carrot. Not saying the climatologist letter is wrong or a lie, but it's a poor comparable for the Swedish immigration study
This was a lot of words to sorta, kinda admit you were wrong about ivermectin. And all the reading of tea leaves and "sensing of the dynamics" does not account for an obviously coordinated effort to suppress and discredit a generic drug that at the very worst does no harm. From a bizarre media blitz of calling it "horse dewormer" to doctors losing their medical license for prescribing it and pharmacies refusing to fill it. This is unexplainable behavior, and so, they don't explain it. The harvest was meh this year, comrade. Pay no attention to those Golden Arches. Only horses eat there.
Speaking of bias, you can misrepresent a situation while doing nothing but presenting the truth. Imagine, for example, that FOX News took it on to report *every* case of a major crime committed by an illegal immigrant. They could have teams of investigators on every case, outclassing the police, reporting nothing that wasn't sourced to objective evidence or at least 3 separate, credible eye-witnesses. And it would still be a misrepresentation because it was placing undue emphasis on one group of people.
That's an extremely general issue with any kind of reporting, i.e. a form of 'selection bias'.
As always, thank you for your thoughtful consideration of important and complicated issues. I have a quibble:
"I think some people are able to figure out these rules and feel comfortable with them, and other people can’t and end up as conspiracy theorists."
That's a false binary; you present it as 'well-adjusted people who understand nuance vs q-anon nutters' when there are at least three categories. I'd argue that the people who blithely believe Fauci is a hero who Embodies Science and buy t-shirts and devotional candles with his likeness do MORE damage than people who think Fauci engineered covid so that Bill Gates could distribute his 5G chips. On the other side of the spectrum from Alex Jones are people who ALSO aren't able to 'figure out these rules and feel comfortable with them' ... rather than misinterpret the meaning of the game, they *fail to see the game at all*, and then fall victim to the same tribalist comfort of unthinkingly belonging to a team.
The middle, rational position isn't to 'feel comfortable' with the game - that's way too much like not seeing it at all - but to simply realize that everyone, to some degree or other, is lying to you, *all the time*, and act accordingly.
I think this piece overstates the importance and implications of "FOX wouldn't make up the fact that there was an act of Islamic terrorism". Let's say that's true. But still, if they choose to report on 30 true acts of Islamic terrorism and choose not to report on 300 true acts of non-Islamic terrorism, what meaningful knowledge do you gain from correctly understanding that they wouldn't make up the 30 that they did report on? Same thing on the left, if CNN reports that there were 24 new laws that limit people's ability to vote, and they wouldn't and didn't make that up, but they also don't mention that there were 50 new laws that enhance people's ability to vote, what important thing have you learned from correctly understanding that they wouldn't outright lie about the 24?
Yep, I think that most of what is broadcast as national-level "news" is worse than useless for most people, and its undue prominence contributes to many problems. But such is the unfortunate reality of the human brain that a dozen deaths (if dramatically presented, even better) is a tragedy and a million is a statistic, and in a fair competition for attention tragedy beats statistic every time.
Yes. Lying by omission is a thing, right?
A thing protected by the First Amendment, yes.
The First Amendment doesn't protect obscenity.
See, the comment I just made is completely true and also completely irrelevant.
There's another possibility: you think you understand the rules of the game but you don't really. I think this could well put you in the worst position of all. To be specific, I find it very hard to square the early statements on the possibility of a lab leak with the rule that scientists won't "flatly assert a clear specific fact which isn’t true".
As I take it, that's not so much a hard rule as just an event which is very unlikely. Anyone will lie if they think there's a big enough benefit in it, even scientists. In the particular case of lab leaks and related matters, I guess by coincidence both American and Chinese authorities found the possibility embarrassing and applied pressure to suppress the idea. This interacted in a strengthening manner with the ideology of Science™ among the usual suspects in the media, who were all too eager to denounce speculation as "mere craziness".
I agree, I think all the rules have fuzzy edges and the exact locations of the edges can change over time. But that makes it even *more* important not to trick yourself into trusting rules that don't actually apply.
Well, I would like to clarify a bit that what Scott meant was really a heuristic rather than a hard rule. Most of the time a scientist still won't make clear specific statements that are false, so it's a good heuristic, since heuristic methods are not expected to work 100% of the time. It still seems easier to rely on experts as a heuristic, while keeping track of exceptions such as "the scientist who said P lives in an authoritarian country & said country would find ~P embarrassing", since the alternative would be too cumbersome (I certainly don't have the time, funding, or capability to inspect everything to my personal satisfaction, do you?).
Some suggested 2nd-order heuristics ("heuristics about heuristics") for when a scientist might say false stuff (there are probably more, feel free to suggest):
- the scientist might be pressured by an authoritarian country who finds the truth embarrassing, as discussed before
- there is some temporary crisis which could be worsened by public hysteria and authorities might credibly make use of the scientist's social status and esteem to minimize disorder & panic
- the specific false stuff is from a "cute" study widely reported in general media, and the scientist is currently giving a TED talk or being interviewed
- you may be a startup founder and the scientist is a stakeholder in one of your competitors
- the scientist is named "Euler" and you're an atheist hanging out at the Russian imperial court
The AIER rebuttal has its share of seemingly calculated half-truths and insinuations. For example, "But Marx’s articles for [the New York Tribune] consisted of brief news summaries about the Crimean War, continental European politics, and piles of dry filler material about annual crop yields and industry reports. Only a small minority of these works ventured into something resembling a cohesive Marxian economic theory"—that last part is unsurprisingly true, since there wasn't yet such a thing—but his columns seem to me characteristically passionate and ideological. The author even generously links to those articles. Did he not expect anyone to check?
I feel there are two orthogonal layers here.
A) Status signalling
B) Deceiving v truth
Deceit rather than lying, generally, because they technically don't lie, but still with the mens rea of giving people a false impression.
Most comments seem to focus on the deception part, but I don't feel it is the main driver here. It mainly seems to be a complex game where high-status people try to trip up low-status people, so they can laugh at them as conspiracy theorists or people who don't understand the elite terminology.
The B) section is quite blatant once you penetrate it: the Swedish government and Marxists are actually trying to deceive for simple political purposes, and that is just normal politics.
IMO Section II should have referenced Russian Collusion. This was incredibly corrosive to the nation's political discourse, was relentlessly pushed by WaPo for years, and was false.
I had a similar thought the other day, reading a tweet from Jesse Singal that he had "absolutely no fucking clue who to believe about anything Omicron-related" due to the public health officials having "beclowned" themselves.
A binary "to believe or not to believe" is a naive question. Instead, the question is "what information can I extract from this?" Public health officials tend to be paternalistic consequentialists (e.g. saying what they think we need to hear rather than what is most truthful), get roasted more for being wrong in one direction than the other, and, like other humans, have an inflated sense of their own importance, virtue, and correctness.
Through that lens, I understand (more or less) why public health officials have exaggerated certain risks (e.g. outdoor transmission), overstated the benefits of certain interventions (e.g. masks), flatly denied that there is evidence of efficacy for interventions with conflicting, generally positive but low-clinical-significance effect estimates (e.g. ivermectin), and have been painfully slow to update their guidance in the face of new evidence.
If a public health official says "yes some evidence suggests ivermectin has some efficacy, but if you're so damn interested in an efficacious intervention go get the god damned vaccine", some people will only hear "ivermectin is effective." Based on that statement, perhaps 1000 people who might have gotten the vaccine won't, thinking that ivermectin will ensure their health if they get Covid. More saliently, they open themselves up to the criticism of their paternalistic public health peers. So of course they don't want to say that. Is that irritating? Yes. Is it the right move? Not sure... it's hard/impossible to predict the consequences of people hearing "ivermectin is effective" vs the consequences of "yet another misleading statement from public health" if the evidence of efficacy is denied.
That isn't to get them off the hook. I dislike the paternalism of healthcare generally, as it negatively affects me personally and I suspect it is an overall "inadequate equilibrium" (which is to say, there is a better way). I also get why people distrust public health officials. But what I don't get is Jesse (a smart and perceptive dude) having "no fucking clue" who or what to believe. Or rather, I don't think his problem is actually epistemic - he *does* have a fucking clue. He's really just stating his objection to misleading statements. I am sympathetic, though ideally he wouldn't be broadcasting "we can't know what to believe!" when, IMO, we have enough information to triangulate on probable truths.
This also reveals why many people should not trust public health authorities. Even if we are being generous and assume they are acting like utilitarians, they will still support policies that kill 99,000 but save 100,000.
And if the 99,000 are overwhelmingly part of a certain group (e.g. young males), then that group is right not to trust them.
Regarding "the fake news that falsely claimed that Saddam Hussein was undertaking a major weapons of mass destruction program": what exactly was it, then, that the Israeli Air Force destroyed in 1981 in Operation Opera (the raid on the Osirak reactor), literally days before it was about to go critical (i.e., the reactor core fired up, after which the fallout released by its destruction would be potentially devastating to the Iraqi civilian population)? Yet another totally innocent aspirin + baby food factory?
I took the Weapons of Mass Destruction claim to refer to weapons that were functional at the time the claim was made, or were being built at that time.
> I’m not blaming the second type of person. Figuring-out-the-rules-of-the-game is a hard skill, not everybody has it. If you don’t have it, then universal distrust might be a safer strategy than universal credulity.
This is where I have a failure of empathy. Figuring-out-the-rules-of-the-game is an obviously important skill, and growing up I had it as part of the school curriculum three times before I was twelve. Sure, there's a long list of critical details about how journalistic sausage is made (ex: headlines are written by other people, Opinion articles exist in a different universe from fact-checkers, "editor" as a job title is meaningless, etc.), but at a base level I don't understand how someone lasts a decade on the internet without learning to parse articles for *what is actually being claimed*. Not the selling point, not the impressions, not the feeling it tries to leave you with, but the factual information it attempts to convey. (Or just as importantly - the lack of any such.)
I'm not up to writing it all out right now, but I'll caution against portraying this as a one-dimensional binary: the opposite of the savvy reader is not merely unskilled at parsing truth, but also actively disengaged. The epistemically dangerous territory is where people get burned by not understanding the rules, and instead turn to areas where not even those rules apply. (Yes, I'm talking about social media. Comments sections very much included.)
The combination of inability and lack of give-a-shit to:
1) read sources critically
2) understand the difference between what is said and what is meant
3) differentiate between intended and likely consequences
4) understand that not everything has a clear villain and hero
These are all things that are really, really common among the average to low IQ crowd that I think the higher IQ people who can and do engage in these behaviors have a really hard time modeling.
I think I want to double down as having a lack of empathy for insufficient give-a-shit rather than not understanding a lack of skill. I know *why* Joe Blow just reposted an article, echoing Twitter commentary that's cleanly contradicted by the first paragraph. An assumption of charity doesn't stretch to the idea that he was haplessly duped by his previously-reliable friend who sent him the article, he just didn't *care* about its content beyond the value of signal-boosting the latest CW. That's probably in the same universe as the uncanny reader, but "I was fooled by the headline" is a shit excuse and blaming lying journalists rings hollow when literal children know better.
Are you using "clueless" and "savvy" in the sense of the Gervais principle? Maps pretty well.
The response to this is many things. But one of the responses to this is that you assume that the liars do the same quality of lying all the time. It may be that politicians are willing to lie about tax increases by calling them something else, but not willing to lie by completely denying them. But it *also* may be that their willingness to lie varies depending on political whims. Remember "read my lips, no new taxes"? That was a more direct lie than just saying that taxes are revenue enhancements. But that wasn't how Bush typically lied, and indeed when he raised taxes, he actually claimed that he hadn't lied because the new taxes were revenue enhancements. He just switched from an unbounded lie to a bounded lie, while lying about the same thing, and using one of your own examples.
>The reason why there’s no giant petition signed by every respectable criminologist and criminological organization saying Swedish immigrants don’t commit more violent crime than natives is because experts aren’t quite biased enough to sign a transparently false statement... And that suggests to me that the fact that there is a petition like that signed by climatologists on anthropogenic global warming suggests that this position is actually true.
This is only true for trivial reasons: If the media or politicians believe they can get away with telling a lie, it can't, by definition, be transparently false.
But I'm pretty sure even you can think of cases where prominent officials or media came out in favor of something that was false, and known to be false at the time. You can look at a lot of statements about COVID for examples.
This doesn't seem to touch the thing that I assume FOX News does to actually mislead people (I don't have FOX News, but I'm extrapolating from how the Sun and Daily Mail behave), which is that if there were over the course of the year 10 mass shootings, 9 by people with nice white-sounding names, and 1 by Abdullah Abdul, they would not report the first 9 at all, and would fill the airwaves for months with speculation about Abdullah's terrorism.
And "liberal" media TOTALLY does this lying-by-omission thing. There are lots of stories that I think are interesting and important that The Guardian doesn't cover because they reflect badly on "our" side.
[I really can't stand the tribalism of it all, hence the scare quotes, it's easier to use these reductive labels in a discussion but etc etc etc ]
Do you actually read the Sun or the Mail? I just ask because I've never seen them fail to report an important news event (and I try to look at all the front pages if I can*), regardless of the perpetrator. It's because their readership is probably a lot more interested in the event, the victims, the perpetrators, than it is in having a racist agenda pushed on it. So any sort of mass murder against the general public is front-page news, because it sells papers/brings in visitors. Their comment sections might rightly or wrongly focus on the tenth shooter, but even there that might be legitimate: to use a not-exactly-hypothetical example, if that shooter were a failed asylum seeker who had not been deported, should that not be a subject to be discussed despite the other 90% of recent mass shooters being locally born and bred? If you can cut mass shootings by 10% through immigration policy, should this not be discussed?
I suspect Fox is the same here: I've not heard that it fails to report major things (it seems to have reported that President Biden won the election, for example), but its editorial line may choose to emphasise some things over others for analysis and discussion.
As a side note, the Guardian has seemingly shifted its editorial stance of late. You're actually not only getting stories about the pushback to the cancelling of Kate Clanchy, but even opinion pieces in support of her, which suggests they think their readership is not militantly woke.
* If you'd said the Express ignored nine shootings by people with nice white names (my middle class upbringing suggests this excludes Waynes, Bradleys, Traceys etc) to talk about Diana or predict unusually bad weather instead, I'd have believed you on this basis mind you!
The difference is not in which event Fox News will be talking about the day it happens - all 10 shootings will get covered. But on which one they’ll still be talking about a week later (flip as needed for other outlets).
I mean, we saw this sort of thing very clearly with the synagogue hostage situation - some outlets were careful to only report on the perpetrator as a “British man” and were very credulous about claims that the crime was not targeted at Jews. But EVERYONE covered it.
Everyone covered it, and everyone forgot about it very quickly. If the perp had been a white supremacist then undoubtedly it would still be the top story on both CNN and Fox News, but he wasn't so we'll never hear about it again.
Agreed that what each outlet chooses to focus on after an event varies. That may not be a bad thing though, since all outlets will report the shootings, any arrests, the start of trials and convictions. This means that although the interpretation may skew towards particular views, I think Scott is right to state there's a basic level of trust around the facts that can be shared by all news providers in your average democracy, whilst there are spaces in which different viewpoints can be examined. We might favour opinion pieces that blame the shootings on class repression, racism, excessive liberalisation of the law enforcement agencies, or lack of proper religion nowadays, depending on taste, and we have outlets that let us explore those preferences; but at least we're confident that it's only at this second level that the basis for discussion changes. We are at least all discussing the same basic facts. In fact that's probably a reasonably good rule of thumb for conspiracy theories: if they claim all news outlets are ignoring something majorly newsworthy, then it's probably (>95%) a conspiracy theory.
To be fair, the NYT and the WaPo are not exactly Fox News, and they too were parroting "unnamed highly placed sources" who insisted that Iraq was chock-a-block with WMDs, and later parroted other conspiracy theories in support of Empire.
I have similar contempt for them that I have for Julius Streicher and Alfred Rosenberg. And Fox, for the record.
I enjoyed the article, but I'm not quite sure of what the take-aways are, other than: "Don't believe everything you hear. Don't disbelieve everything you hear. Keep working on improving your map." Which is just par.
I tend to lean very hard on the "trusting the establishment" heuristic. Needless to say, it makes me a bit of a pariah in many circles. If we had a regime where more people actually read articles rather than headlines and at least skimmed the Wikipedia article on a subject before commenting on it, discussions on politics would improve exponentially.
On the other hand, I struggle with how much of this is because I might have skills or heuristics that other people just fundamentally don't have. And furthermore, where those heuristics come from. Did I learn them in school and could it be improved though education, or is it just an individual quirk of me and people like me?
I do remember that. At most though, it was like a one-day unit in freshman high school English. And this was an upper-middle-class high school in a major metropolitan area, for reference. Could also be a function of high reading/writing comprehension.
Perhaps if Philosophy was part of the standard grade school curricula, there would be more like you.
"I tend to lean very hard on the "trusting the establishment" heuristic."
"at least skimmed the Wikipedia article on a subject before commenting on it"
This is very revealing.
Wikipedia's entire schtick is to act like a neutral aggregator of information, and then systematically censor all non-establishment sources and opinions.
https://thecritic.co.uk/the-left-wing-bias-of-wikipedia/ - this is a good compilation.
https://en.wikipedia.org/wiki/Google_and_Wikipedia - from Wikipedia themselves! The "CIA World Factbook" was my favorite part. Talk about establishment!
> I'm a liberal who doesn't trust FOX News, and sure, I believe it. The level on which FOX News is bad isn't the level where they invent mass shootings that never happened. They wouldn't use deepfakes or staged actors to fake something and then call it "live footage"
How do you know?
I am not asserting that this is true. I am not asserting that Fox news, or any other news outlet, does this.
What I am asserting is: _**IF**_ they did this, you would not be able to know.
So I am curious: What is the basis of your epistemology such that you can say with confidence that Fox News wouldn't make up video coverage of a shooting?
(And, lest you forget, I will remind you that similar falsifications have been documented repeatedly across all news agencies. https://www.nytimes.com/2019/10/14/business/media/turkey-syria-kentucky-gun-range.html is the most recent example I remember. Story: SPOOKY TURKISH TERRORISTS SHOOTING UP EVERYTHING. Video: a gun show in Kentucky. Why are you so certain that Fox would _never_ do this, when ABC did exactly the thing you are describing, three years ago?)
That last parenthetical seems like a very different scenario. Using the wrong footage (whether through incompetence, malice or just laziness) to illustrate an actual story is one thing.
But the reason I trust a big news network not to fake a whole mass shooting is simply that it doesn't serve anyone's incentives to do so. Faking a whole live news story in the middle of New York is a huge undertaking with a budget of millions and hundreds of staff who would need to work on it. Even if it goes perfectly and nobody ever blabs and none of the other news networks figures out your subterfuge and nobody ever notices that all the victims were paid actors, the maximum benefit is rather small and the risk is enormous. What's the point? It's a lot easier to cover the real news than it is to stage totally fake events.
Of course in practice the left has never accused the right of faking a mass shooting, it's the other way around (Sandy Hook). It seems equally implausible in this case for the same reasons.
> Even if I learned of one case of them doing something like this once, I would think "wow that's crazy" and still not update to believing they did it all the time.
Well, this is interesting and probably highlights a fundamental difference in epistemology between you and me.
I assume everything is always at equilibrium, unless actively disrupted. That means, if I catch the news doing this _one time_, and I can't point at a unique and specific explanation for it, I will assume it's been happening all along and I only just noticed.
This seems so obviously correct to me that I'm curious as to how you can convince yourself that "oh it just happened once"
Your view would be correct under the assumption that your ability to notice things is low, and the variance in the news's untruthfulness is low. If you adjust those assumptions, it becomes reasonable to believe a single case is actually just an outlier.
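That intuition can be sketched with a Beta-Bernoulli update (the prior pseudo-counts below are invented for illustration, not measured from any outlet): one caught fabrication moves a weak prior a great deal, but barely budges a prior built on a long track record.

```python
# Sketch: how much one caught fabrication should shift your estimate of an
# outlet's fabrication rate depends on the strength of your prior.
# All pseudo-counts below are illustrative assumptions.

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution: the expected fabrication rate."""
    return a / (a + b)

# Weak prior (almost no track record): one observation moves you a lot.
weak_before = beta_mean(1, 1)      # 0.50
weak_after = beta_mean(1 + 1, 1)   # ~0.67 after observing one fabrication

# Strong prior (years of mostly-accurate coverage): one lie barely registers.
strong_before = beta_mean(1, 999)     # 0.001
strong_after = beta_mean(1 + 1, 999)  # ~0.002
```

On this toy model, "oh, it just happened once" is a coherent reaction only if your accumulated evidence of accuracy is large relative to your ability to catch lies; the disagreement above is really a disagreement about those prior strengths.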
> Or "FOX is against gun control, so if it was a white gun owner who did this shooting they would want to change the identity so it sounded like a Saudi terrorist".
They do this all the time, +/- a technicality.
Every news outlet aggressively suppresses demographic data when reporting on crime, if and only if that demographic data is not aligned with their narrative. See, for example, Coulter's Law. Sure, it's not technically lying, because they didn't present a false identification. But selectively hiding the identification when it's editorially convenient is no different
It is different because the not-technically-lying bit leaves you with *some* bits of accurate information. If they tell you that the shooter was white, you have high-confidence information that the shooter was white. If they don't mention the race of the shooter but do mention the race of the victim, then by Coulter's law you may have moderate confidence that the shooter was nonwhite. If the reporters were actually lying outright about that sort of fact, then you'd have *zero* bits of accurate information - CNN would always tell you that the shooter was a white Trumpist and Fox would always tell you he was a Saudi terrorist or whatever, and you'd have no way of knowing.
Unless you like leaving potentially useful bits of information lying around unclaimed, you might want to make use of the fact that reporters are generally "not technically lying" even if they do frequently omit things.
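The "some bits of accurate information" point can be made concrete with a toy Bayes calculation. All the probabilities below are invented for illustration: if an outlet omits a detail much more often when that detail is inconvenient for its narrative, the omission itself becomes evidence.

```python
# Toy Bayesian update: what can a reader infer when an outlet omits a detail
# (e.g. a shooter's race)? The omission rates here are assumptions, not data.
from math import log2

def posterior(prior, p_omit_if_true, p_omit_if_false):
    """P(hypothesis | detail omitted), by Bayes' rule."""
    evidence = prior * p_omit_if_true + (1 - prior) * p_omit_if_false
    return prior * p_omit_if_true / evidence

# Hypothesis: the omitted detail is narrative-inconvenient for the outlet.
prior = 0.5                        # before reading the article
post = posterior(prior, 0.9, 0.3)  # assumed omission rates
print(round(post, 2))              # 0.75: a moderate-confidence inference
print(round(log2(post / prior), 2))  # 0.58 bits gained over the prior
```

The point of the sketch is only that "not technically lying" leaves the mapping from reality to report partially intact, so a calibrated reader can still update; outright fabrication would set both omission rates equal and leave nothing to extract.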
One additional complexity is that this heuristic of distrust is not symmetrical. What you say about FOX is true, but you couldn't turn that around and apply it to the New York Times. The two sides of the political aisle have different epistemologies.
The Right twists the truth because they see themselves as working in service of the Truth, or at least America. A significant contingent on the Left have disavowed both Truth and America. In their value system, truth claims are only a method for subverting some power structure or other, and are judged not primarily by accuracy but by how well they support whatever progressive narrative is in vogue.
This is turning into a fisking. I apologize
> There are lines you can cross, and all that will happen is a bunch of people who complain about you all the time anyway will complain about you more. And there are other lines you don't cross, or else you'll be the center of a giant scandal and maybe get shut down.
Are you... are you watching the same country I am?
I don't know about Scott, but I'm watching the country where Dan Rather and Brian Williams lost their jobs for crossing those lines. The lines may not be drawn where you'd like them to be, and you may be outraged by what goes on on the other side of them, but they do exist and they mark a (somewhat convoluted) safe space for accurate information.
Ivermectin ... probably does some good. Most of us in the northern hemisphere don't carry much of a parasite load ... yes, carrying a parasite load is a thing. For others who live in wetter climates, where eating a tomato fresh off the vine can expose you to parasites, things are different. Probably all of us carry some slight parasite load, and setting those parasites back a bit probably benefits us. Anthelmintics don't necessarily kill off all the parasites like a magick silver bullet, but just set them back a bit ... or maybe quite a bit, based upon dosage.

Does ivermectin have other benefits? Yes, I read years ago (when I was a cowboy, using a lot of ivermectin) that treated populations in South America saw a reduction in certain types of cancers. Not that I ever intentionally treated myself ... but the form of ivermectin we used was the pour-on form. You have a bottle with an open chamber on top; you squeeze the bottle and the chamber fills with the desired dosage ... and ideally you pour the ivermectin in a stripe down the center of the animal's back, just as you do with flea drops on your cat. But there you are with an open cup of ivermectin, trying to pour it down the back of an eleven-hundred-pound animal that is scared, fighting, and doing a pretty good job of kicking your ass ... and things get a little wild, and you wonder who received the better part of that dosage.

But back to Dr. Malone ... who developed mRNA technology, who is fully vaxxed, and who works with alternative uses for existing meds ... and suddenly this guy is a pariah. Something is going wrong.
Now on to Climate Science ... all you have to do is read the Climategate files, and consider that anyone who says "whoa, let's think this through" gets labeled a science denier. Just last week we learned that climate change causes volcanic eruptions ... Ummm, where are the adults in the room? And climate change causes floods, and droughts, and warming, and cooling, and everything is weird because of climate change ... and maybe the actual real changes due to climate change are so very slight that no one will ever feel it ... unless you're on the spectrum, and you can see CO2—a clear gas—in the air.
Here's the real problem with climate change. Global warming is causing polar ice and glaciers to melt. This increases sea level. Increasing warmth causes sea water expansion further increasing sea level, as a matter of fact, this causes acceleration in sea level rise. Great, we have something we can measure. Now go look at actual sea level rise data from NOAA. https://tidesandcurrents.noaa.gov/sltrends/sltrends_station.shtml?id=9414290
It's about 2 mm per year, and as you can see, pretty flat and steady. According to the IPCC AR5, CO2 was not in sufficient concentration to affect the temperature until about 1950. But we see in the San Francisco data that global warming caused sea level rise back in 1854, a full 100 years before CO2 could have done the job. So there's a lot going on that we're being lied to about.
> In the end, I stuck with my belief that ivermectin probably didn’t work, and Alexandros stuck with his belief that it probably did. I stuck with the opinion that it’s possible to extract non-zero useful information from the pronouncements of experts by knowing the rules of the lying-to-people game.
What do you do when you don't know those rules?
After all, you can't learn them by asking people; they might be lying to you
> The people who lack this skill entirely think it’s crazy to listen to experts about anything at all. They correctly point out time after time that they’ve lied or screwed up, then ask “so why do you believe them on ivermectin?” or “so why do you believe them on global warming?” My answer - which I don’t think is an obvious or easy answer, it’s a bold claim that could be wrong, is “I think I have a good sense of the dynamics here, how far people will bend the truth, and what it looks like when they do”. I realize this is playing with fire. But listening to experts is a powerful enough hack for finding the truth that it’s worth going pretty far to try to rescue it.
If I understand you correctly, you're essentially saying "trust the experts if and only if they say things you already independently know (or strongly suspect) to be true".
That's equivalent to saying "don't listen to experts, just listen to yourself".
So at least we both agree that we should not listen to experts, because they lie.
No, he's saying trust the experts only if you know the rules governing what the experts will and will not lie about and how. That's a different thing, a more easily learned and generally useful skill than e.g. specific understanding of climatology.
> And the clueless people need to realize that the savvy people aren’t always gullible, just more optimistic about their ability to extract signal from same.
After watching what you fascists did to my society in the name of 'health' last year, there is zero signal to extract and you're all just either hallucinating one, or brazenly appealing to authority to attempt to pull rank over me
Always looking for new people to add to my list of interpreters. Anyone brave enough to post some they've found? Joe Wisenthal in finance (odd lots podcast) is my favorite example of this.
>But the experts in the field won't lie directly.
To an extent. Here's some examples of tactics common among historians:
1) Quote someone approvingly, or build upon their work, etc. etc. while leaving out or brushing over their lies (or directly quoting the lies approvingly, but leaving just enough room so you can say they said it and not you). See for instance Vine Deloria and Red Earth, White Lies.
2) Make an unsupported statement that at the same time can't actually be disproved. Very common in fields like art history, where you can make all sorts of claims about the author's intentions. My favourite example is a claim that there is a link between trains and atrocities like the Holocaust - not based on any claim that railways make committing atrocities easier, but on something psychological about passengers being essentially trapped on a train until it stops.
3) Just plain lying and hoping no one will notice. See Arming America.
A few hours late to the posting frenzy and not sure if anyone will read this, but this made me think a lot about my bounded distrust of science (I'm a biologist) and other people's trust in science. I think Scott's 5-HTTLPR post on the old blog summarized the dynamic in science beautifully. If something becomes a possible right answer, there will be an endless stream of small-scale studies "proving" it, which is why there are a few hundred studies showing a spurious link between 5-HTTLPR genotype and depression. Likewise:
- It is zero surprise to me that there are a bunch of small scale studies showing that Ivermectin treated COVID and that these did not reproduce in the large TOGETHER study. The Alexander vs. Marinos vs. Katz argument about what small studies to include and exclude seemed almost meaningless. Infinite small studies can be wrong.
- I expect there will be hundreds of studies showing that Long Covid causes virtually all human medical conditions. There are some already and will be more in the coming years.
- There will be lots of studies showing neurological impairment or whatever from lockdown or being a child during the COVID period. These are also starting to emerge.
- There will be a bunch of studies showing long-term vaccine side effects, although fewer than Long COVID positives, because this is a less respectable right answer.
And of course most of these findings will be wrong.
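The many-small-studies dynamic is easy to demonstrate with a quick simulation (my own sketch, not from the comment above): even when the true effect is exactly zero, a stream of small studies reliably produces "positive" findings at the conventional significance threshold.

```python
# Illustration (assumed setup, stdlib only): simulate many small "studies" of a
# treatment with ZERO true effect and count how many look significant anyway.
import random
import statistics

random.seed(0)  # reproducible run

def small_study(n=20):
    """Compare two groups of n participants drawn from the SAME distribution,
    i.e. there is no real effect to find. Returns True if the study still
    looks 'significant'."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    # hand-rolled Welch-style t statistic
    t = (statistics.mean(a) - statistics.mean(b)) / (
        (statistics.variance(a) / n + statistics.variance(b) / n) ** 0.5
    )
    return abs(t) > 2.0  # roughly the p < 0.05 cutoff at ~38 degrees of freedom

hits = sum(small_study() for _ in range(1000))
print(f"{hits} of 1000 null studies came out 'significant'")
```

With 1000 null studies you expect around 50 false positives before any publication bias; add selective publication of the "hits" and a literature of a few hundred confirmatory small studies is unsurprising.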
And yet I don't "mistrust science". When Pfizer does a study on their vaccine I assume it's correct, and I got vaxxed and boosted as soon as I could. There's a subtle difference here. If Pfizer were doing 50 small-scale studies on different vaccines, I suppose I'd group them with the myriad Ivermectin studies. But holy crap, how does anyone unfamiliar with the subtleties of our information ecosystem believe anything biology produces anymore? You are constantly bombarded with our wrong studies, and yet you still haven't burned our labs to the ground. Perhaps there have been enough unambiguous successes (the polio vaccine, your children not dying all the time) that the general population is willing to forgive us for being wrong a lot, but I feel like COVID has exposed how much wrong crap we publish in a way that never really happened at this scale before.
This article puts a label on why I was so furious about the "who cares, politicians always lie, take him literally but not seriously" business during the Trump presidency - because it imagines that all lies are equal. If the New York Times writes a slightly misleading headline then you can't trust them at all, they're as good as the people who think the moon landings are fake.
If the Soviets say that the harvest is "glorious" when they really mean "good", that's still correlated with reality - it's not as good as they claimed, but you probably aren't going to starve this winter. If they announce that *every* harvest is glorious, and each fall has greater and greater surplus no matter how many people are starving in the streets, you get *zero* information. The lie is not connected to reality at all.
We can't force every publication to provide only perfectly accurate information, for many reasons. But we can put bounds on what sorts of lies are acceptable, and make sure that they don't stray too far from the truth. And that makes it important to (1) call out blatant lies, to prevent the boundary of acceptability from shifting, and (2) draw a distinction between calling something out as misleading vs false, to avoid communicating the idea that they're all equally lacking in information.
The flip side of this is that the media was using “Trump lies!!! He’s a dangerous aberration!” as justification for more or less totally throwing out their rule book on reporting with at least a veneer of objectivism rather than open advocacy. Part of “take him seriously, not literally” was pushback against media catastrophizing everything Trump said and acting like they had never been bullshitted by a politician before, despite every one of them having the exact sort of carefully tuned bullshit meter that Scott talks about in this post.
Boring linguistic point: Abd-ul-Allah, contracted to Abd-ul-lah, means Serves-the-One-God. Abd-ul just means Serves-the. When you see the name Abdul, it's usually double-barrelled with a second element, like Abdul Rahman: Serves-the-Merciful. (The Merciful is also Allah. He seems to have a lot of names.) Abdullah Abdullah and Abdullah Abdulrahman would be fine names for a Saudi terrorist. Abdullah Abdulhussein would not, since Hussein (handsome one) isn't a name of Allah, and Salafi Muslims sniff at names that imply servitude to anyone but Allah.
This is the best nitpick I've read in ages, thank you.
I describe mainstream/liberal media (NYT not Fox) thusly:
"They won't lie to you, but they won't always tell you the truth"
Historians understand problematic issues related to "objectivity", and journalists are even more subject <sic> to these because of the temporal difference. A prior, clear statement of principles and beliefs has always been useful for evaluating historical or journalistic interpretations of data or evidence. If someone tells me where they stand then it really helps me with verifying (or falsifying!) their ideas.
Wow, did you realize when you published this that Alexandros had already quote-tweeted what looks like a smoking gun on IVM? Author of a meta-review seems to admit he knows IVM works, but that he plans to ease into that result over the next 6 weeks, allowing hundreds of thousands of needless deaths in the interim. https://twitter.com/alexandrosm/status/1486136274385702912
Quite the contrary . . . Hill seems quite biased towards trying to prove that ivermectin "works," even though he knows that without the fraudulent and biased studies, the overall evidence shows that it doesn't really work. It is completely absurd to think that ivermectin would save "hundreds of thousands" of lives in a few weeks . . . if ivermectin had that strong of an effect, it would be like Gleevec, and no one would be quibbling about the evidence.
I have no idea what source you have in mind here but apparently his conclusion was given to him, if this source can be believed.
“Hundreds of thousands” is me rounding down from an 80% recovery rate as apparently discussed earlier in the call.
https://www.worldtribune.com/researcher-andrew-hills-conflict-a-40-million-gates-foundation-grant-vs-a-half-million-human-lives/
There is absolutely no reason to think that ivermectin would prevent 80% of deaths.
On the autistic spectrum, this is the story of my life.
Everyone routinely states falsehoods, and I don't have the brain wiring to easily and intuitively figure out their intentions, including whether there's any intention to deceive.
I stated one in my second paragraph - somewhere, there's almost certainly some human who never states falsehoods, even if it's only a pre-verbal infant.
A high functioning autistic learns, often painfully, the specific rules that govern the statements made in their cultural niche, and what variations are common or possible. They then watch these deduced rules get violated in increasing numbers, conclude there's been yet another cultural shift, and work on deducing the new set of rules. (Or they move to a new sub-culture, and find the rules there unexpected, having been told (falsely) by non-autistics that their local rules are self-evident aspects of human nature.)
FWIW, at the moment I don't have a good sense of which falsehoods are acceptable in advertisements made in the US, beyond "far too many". I don't have a good sense of what falsehoods are OK in support of political positions - is there *anything* Trump wouldn't say, if it benefitted him? And while I don't expect public figures among his political opponents to make *checkable* false claims about elections, or the numbers at a public event, that's about all I'm sure they wouldn't do.
Maybe the Soviet Union used coded language, as you suggest, preserving meaning once one learned the secret decoding key. That's certainly common in both resumes and job advertisements, with a few outliers producing entirely fake experience, and (more commonly) non-existent bait-and-switch job ads. There might be a similar way to decode advertisements and political speech, but AFAICT it's easier to make no checkable claims, beyond "this product is infinitely wonderful [in unspecified ways]" and "our political opponents are infinitely evil [attached to a list of lesser sins, often unverifiable]".
With the rules changing constantly, I'm not entirely sure it matters. Major mistrust makes sense, along with a heavy dose of epistemological uncertainty.
This and the EEG study arrived in my inbox 3 hours and 2 minutes apart, and this amuses me.
I like this post, I agree with it in theory, but in practice I kept saying "but I'm not sure you are making your point here." For example, the argument that Fox wouldn't report on a mass shooting event if it didn't happen while other news wouldn't report on there being no fraud if there was. These don't seem equivalent to me. One produces tangible bodies. The other produces anomalies in paper trails that due to our preference of anonymity over security are hard to validate. And I say that as someone who doesn't think there was (above normal background levels of) fraud in the election.
And this makes the proximity to the EEG post amusing. Because here is a batch of articles with easily referenced evidence, with holes ready to be poked in, that yes, a handful of experts I don't follow on a platform I don't use shot down - but it isn't like the usual suspects are going to substantially correct their news articles, is it? What is the difference between yet another poverty/child-development story and yet another there-was-no-fraud story? At what point can you be confident you are actually threading that needle instead of just suffering from Gell-Mann Amnesia?
This reminds me of a post by Jacob years ago, something about how unless you are REALLY REALLY smart, taking the stupid route in a game is more effective than the slightly smart route, and it's hard to know which category you are in.
The best analogy is the legal system. Everyone understands that the lawyers are advocates for opposing sides, and that they will be cherry-picking the facts and spinning the conclusions and inferences to be drawn from those facts. Both are "untrustworthy" as statements of the truth. But the two arguments (and the counterarguments to them) are expected, between them, to include the relevant facts and analysis needed to get at the truth. So the judge has to winnow out the inadmissible and unreliable evidence and decide who is most persuasive.
Basically, you have to read the NYT as a plaintiff's brief in support of the woke left's case, and then go out and find the counterarguments and act like a judge.
Maybe tangential, but I’m not sure “everyone” gets this. At least not in the sense that “the lawyer’s job is to advocate for their client to the best of their ability, within the rules”. Can’t count how many times I’ve heard people be shocked and appalled that a lawyer might, say, question the credibility of a victim. That is literally his job!
Great article.
Sorry for the typos, I am on my phone.
People, your readers especially (it seems), are scared to believe that they aren't tracking geopolitics. You aren't.
You pay attention, but what you pay attention to is a moving target reported by interested parties. It isn't what they report so much as what they don't/won't/aren't permitted to report.
You can't spend your life believing that being ABLE to track what is going on better enables you to actually do it. You can't, and it doesn't (in my opinion).
I am not a fan of corporate media, so you could easily dismiss this as an argument against certain sources of information.
Travel the world, notice the discrepancies.
Ask Koreans how much they actually think about North Korea in everyday life.
Ask an Israeli citizen-soldier what they KNOW/witness on the front lines (take it with a grain of salt, of course, but ask ten!).
These facts you can hold to be self-evident: we do not have control over what we do not know, and every "fact" we are willing to gobble up that is afforded us by interested parties can be, and often is, a tool used as a means to further an agenda.
It is that complicated, and it is that simple.
Might be beating a dead horse here, but since it's relevant to the topic: I used to think the same way Scott does, until COVID happened.
Surgeon General: "Seriously people - stop buying masks! They are not effective in preventing the general public from catching coronavirus."
Fauci said something similar. (he later commented that he knew the masks worked, but needed to say it to save resources for medical staff)
Using weasel words to imply one thing but actually mean another is part of the game, outright saying falsehoods kills the game.
This is a great post!
Personally, I like to think of myself as a savvy conservative. I watch both liberal and conservative media, and I try to understand the biases and compensate for them.
However, personally I think there is a real problem with the way we do science. For example, if it's true that immigrants cause more crime (and it generally is), then scientists shouldn't be punished for saying that. Truth should be the ultimate defense against accusations of being a bigot. If immigrants do indeed cause more crime, then THEY should be the ones who suffer the consequences of that fact - not the scientists who simply point it out. Because ultimately, it's the behavior of the immigrants that needs to be corrected, not the behavior of the scientists, and that behavioral adjustment can't happen if scientists are attacked simply for pointing it out.
Part of the reason that I spread conspiracy theories to destabilize the status quo and bring about a new world order is that I believe a society where we are not allowed to talk about objective scientific data without being accused of heinous thoughtcrimes simply because we are "going against the narrative" is an evil society that deserves to die and be replaced by a society that has more respect for the truth. Do I genuinely believe most of the conspiracy theories I spread? No, of course not. But just as our elites are willing to harm innocent scientists for pushing an inconvenient narrative, I am willing to harm our elites in retaliation for them harming those scientists. If they push us, we push back. If they put one of us in the hospital, we put one of theirs in the morgue.
Institutions that are allergic to truth are evil garbage and the people in charge of those institutions or societies need to be cut down and replaced by any means necessary. If they don't like it, then they should start showing more respect for the truth and stop attacking people just for pointing out inconvenient facts that interfere with their desired narrative. We won't ever be able to eliminate tribalism until we're willing to hurt people for demonstrating that trait. And if we want to live in a high trust society, then we need to make truth our most sacred value. Without that, trust collapses and society falls apart.
As a good friend of mine once said "Aim high, but hit low."
*Who* is not allowing "us" to not talk about objective facts? I suggest it's the people in the mirror. It's we ourselves, in various tribes, that stick our fingers in our ears and shout down the voices we don't want to hear. Facebook and Youtube don't censor shit primarily because the Illuminati or George Soros insist upon it, but because that makes their platform more popular among a substantial demographic -- because the bulk of their own users demand it. People *love* to find a Judas goat and drive it out, it's one of our most favorite forms of social entertainment and team-building exercise. (One is tempted to recommend Shirley Jackson's immortal "The Lottery," or adduce the Aztecs ripping the hearts out of children to remind the Sun not to forget to rise.)
We all love to have "like" and "dislike" buttons on every act of intercourse we come across, so we can express our intentions. *We* demand the censorship of those voices we don't want to hear, and the people who earn their living tending to our communication wishes oblige, since it puts $$ in their pocket.
If you want it to be different, you need to build a stronger social mythology where hearing people say (what to you seem) dickhead things is something you're expected to put up with - responding "It's curious that you think that way; care to elaborate why, preferably with factual observation?" None of this crap about "oh we ought to be able to ban lies/disrespect/misdirection/rudeness," because it is just far too easy for the wolf of idea censorship to creep in under the sheepskin of civility enforcement. That's why the First Amendment doesn't admit of any "hate speech" exemptions; our ancestors were less foolish than we are these days in that regard.
You make an excellent point, and I fully agree with you. The main force calling for censorship are the ignorant masses - the stupid narcissistic sheep who believe that anybody who disagrees with them is evil. How do we change that?
The answer is simple. The best way to turn something into a sacred value is to punish anybody who disrespects that value. For example, people currently respect the sacred value of diversity because you can get fired for disrespecting that value. Imagine if disrespecting the sacred value of truth were enough to get you killed? I bet you that our society's sacred values would shift from diversity to truth REAL fast.
You might say that the people pushing for censorship outnumber us. My counterargument is that we're much smarter than them, and there are a million ways for us to manipulate them into destroying themselves, or play them against a superior force, or manipulate the electoral process so that we can elect political leaders who literally wipe out anybody who believes in censorship. Wolves should not fear sheep. We need to remind the censors and the cancellation mob what fear is. Currently you fear them, and that is the inverse of how it's supposed to work.
So, no censorship unless someone believes in censorship (as defined by you), at which point you should have the right to put their head on a spike. Fair play for all, save those who disagree, in which case you should be allowed to conduct total war against them. Is this the libertarian version of that malapropic formula of the Paradox of Tolerance?
Yeah, I don't agree with that at all. As far as I can tell, this is the exact line of thinking that led from the French Revolution to The Terror: "let us just promptly cut the heads off of everyone who doesn't hold rigidly to the revolutionary principles, mon frère. Liberté, égalité, la guillotine!"
Not for me. Not interested in a police state or a theocracy. Anyone who says "give me power so I can punish those who don't think correctly" automatically goes on my list of people to be exiled, come the revolution and me on the Committee of Public Safety.
I just have a really hard time believing in the "spread lies in order to create a society that tolerates truth" algorithm. It's certainly immoral, but I think it's probably also highly ineffective.
I also don't have a good solution or strategy to offer, though. Certainly amplifying alternative viewpoints and pushing back hard on the credibility of establishment sources seems necessary, but spreading things I know to be false seems counter-productive (and wrong).
It sounds like you're trying to categorize the types of conditions where biased agents are most likely to lie. Instead of doing this bottom-up, as you do with your examples, you should try top-down. You might come up with better buckets than I do below.
My belief is that transparency and access to the same data from multiple observers is where lying is least likely to occur (your example on shooting and the suspect) because it's falsifiable. When there isn't transparency and only a small number of people have access to the direct data (e.g., early 2020 information on natural vs. lab leak origins of COVID), lying should be the starting assumption. If the data isn't easily falsifiable ("sorry, that's a proprietary data set"), and someone has an incentive to lie, there's probably lying going on. Especially true if there is data sitting somewhere that intentionally isn't being made available.
This reminded me of this Military History video. It really made me think about what it would be like to grow up exclusively within a biased environment, and how hard it would be to recognize it - not recognizing the bias, but recognizing the scope.
Soviet Perspective: Invasion of Poland 1939
https://www.youtube.com/watch?v=pqiHjANZQXc
Also on the dystopian scifi/fiction front, Kameron Hurley's "The Light Brigade". Even when you know someone's lying to you they can still fool you into believing a different lie.
Oh yeah, Three-card Monte - scam people by making them think they understand the scam.
I grew up in socialism, and we were taught that the system is perfect, but of course individuals are imperfect, and there is also active sabotage by enemies. So whenever you see something wrong, it is easy to assume it was either a sabotage or a mistake.
When you get lots of data, then you realize that the system itself must be broken, otherwise this would imply too many coincidences (statistically unlikely) or too many enemies (but why would a perfect system generate so many enemies?). But when you live in socialism, lots of data about its failures is precisely the type of information you are prevented from getting. And if you are in a position to get lots of data, you were probably already filtered for your loyalty to the regime no matter what.
Another way to get red-pilled is when a "mistake" is related to something you strongly care about, and when you naively assume it was an innocent mistake and keep trying to fix it, you are met with resistance that completely does not make sense in your model. Still difficult to generalize that the entire system is broken (not just one of its parts).
You've got to be impressed by a system that can get millions of people to *believe* that the system can be perfect although made of imperfect parts. That makes as much sense as thinking you can disassemble a Trabant and use the parts to build a Maserati. My working theory is that only the intellectuals were sufficiently disconnected from reality to swallow that laughable proposition.
This account of how less sophisticated people think fits my experience as an attorney in contract negotiations between companies and unsophisticated parties (such as the negotiation of an easement for a utility line across somebody's property). The problem is not that unsophisticated people are credulous; it is that their suspicion is unfocused and random. They lack the basic skill of reading a contract and distinguishing between boilerplate terms and material terms. So they will sometimes fixate on completely innocuous terms that are not really up for negotiation and miss the places where they are expected to barter for better terms. (Hint: ask for more money.)
Similarly, my grandfather's dementia manifested as a paranoia about financial matters - which is apparently very common. Maddeningly, this fear actually made him more vulnerable to people selling dodgy financial products, who sold them as providing greater financial security.
I do think that the general public is well aware that educated people tend to hide their lies in equivocal language and the things that they don't quite say. This is why attorneys preparing for a jury trial look for expert witnesses who are willing to speak bluntly and stick to their answers under hostile questioning - they are more believable. The heuristic of only believing experts when they speak simply and bluntly works pretty well under most circumstances, but it is vulnerable to exploitation by con men.
I’m surprised you did not note one very big caveat, considering you yourself have written about it recently.
So, you will not hear any experts say “immigrants definitely do not commit more crime in Sweden”. This is, as you correctly note, a bridge too far.
But what WILL happen is that journalists and politicians will say something like “there is no scientific evidence that immigrants commit more crimes, only a racist would believe such garbage” and there will be deafening silence from the experts - no one will speak up and say “well yes technically that’s true but only because you’ve made it literally illegal to publish any such evidence”.
The IPCC will write a carefully researched, appropriately caveated report. NYT will dutifully report the worst case 3 sigma high end of the model as an inevitable catastrophe, AOC will use this looming doom as a justification for universal daycare, and CNN will collectively pretend that tornadoes never happened in Kansas before global warming. All of them will claim that they are “trusting the experts”. And from the “experts”? Crickets.
So the experts themselves don’t blatantly lie, but if they allow their expertise to be cited in the furtherance of a blatant lie, well, it amounts to the same thing.
Give me a break. If scientists took care to pen closely-argued op-eds in the popular press debunking every dumbass extrapolation or bogus interpretation of their work, they wouldn't have time to take a crap, let alone do real work. Obligatory SMBC:
https://www.smbc-comics.com/comic/2009-08-30
People are like that. You discover the cool fact that the uranium-235 atom actually decays by fission and releases a bunch of neutrons, and the politicians and generals rush off and build 20,000 1-MT nuclear warheads on hair-trigger alert, so everybody has to live under the shadow of instant vaporization for 45 years. Oops. You figure out how to send mail electronically, so it gets there in seconds and costs almost nothing, and entrepreneurs invent spam until 80% of Internet packets are junk. You write some brilliant networked-computing protocol, and right away some Russian gangs get to work exploiting it to build ransomware.
Since the work of almost any of us can be used for evil, maybe some kind of mutual agreement in which we always blame the sword-wielders instead of the blacksmiths or metallurgists would contribute better to social harmony and functional discourse?
That doesn’t work when experts DO take political stands, when they agree with them (e.g. epidemiologists praising BLM protests). If they ignore misrepresentations that they like, while jumping on misrepresentations they don’t (e.g. the Lancet letter about lab leak “conspiracy theories”) they lose the benefit of the doubt that they “just can’t be bothered with dealing with every misrepresentation”.
And no, they can’t go attack everyone who is wrong on Facebook, but when national level politicians are using exaggerations and misrepresentations to set policy - hell yeah I expect somebody to speak up.
Saying, “we’re just scientists, we can only do the science part, misinterpreting it or using it for evil is all on you guys” only works if they actually only do the science part. Once they start arguing for policy and taking activist positions, I think it’s fair game to question the things they choose to ignore.
Feel free to savage individual experts who take political stands. I'll be right there with you, throwing rocks -- er...assuming I agree the political stands are bullshit. I have no respect at all for a scientist who trades on his PhD to pretend to any more authority outside his area of expertise than Joe Sixpack. That's like "I'm not a doctor, but I play one on TV, and here's some medical advice..." Or like Hollywood actors on their 6th marriage lecturing the rest of us on morals. Contemptible.
I took issue with your sweeping statements about what "the experts" should *all* or *collectively* do to make sure their work is not mis-used. That's a bridge much too far. I believe in point-of-action individual responsibility, full stop. The guy who pulls the trigger is responsible for the murder -- not the gun, not the gun manufacturer, and not his mother for beating him twice a day every day from ages 6 through 16.
Yes. It is worth distinguishing between the clout chasers on social media who use their “expertise” (which may be something like has degree, is an associate professor) to boost their political activism or promote themselves as a brand, and the much larger group that is not engaging in such irresponsible behavior.
The activists and clout chasers absolutely harmed public trust, and it’s reasonable to assign them some portion of the blame on things like lack of vaccine uptake.
But it’s not reasonable to blame other experts for failing to silence them.
Yeah I know a few who should be shot, and I would definitely vote to revoke their guild privileges, were I still invited to the membership meetings of the Secret Brotherhood. But alas I was caught in the fornicatorium with one of the vestal virgins, so I'm no longer.
I don't think it's worth the effort to dig up examples, but there have been countless outright lies from "experts", just that I am aware of in the past 2 years, at least regarding public messaging.
The heuristic of "they will mislead, but not state objectively checkable falsehoods" is unfortunately not accurate on too many subjects, where the foundations of civil discourse and sensemaking have been thrown out the window in favor of whatever would be effective at achieving a person's goals.
"The savvy and the clueless" - now try telling the one from the other, plus making the clueless accept that label. Hopeless:
1. Some anti-vaxxers are post-grads who read a lot and will mail you dozens of links for each of their points (one colleague of mine). Many of those links do not even look bad.
2. I quick-check the news on a mainstream website. Another colleague (secondary education only) sees that and says: "Oh, that is such an obviously biased source!"
3. Matt Ridley, biologist and science author: "At the time, given that I had written extensively on genomics, I was asked often about the chances that the pandemic started with a lab leak and I said this had been ruled out, pointing to the three articles in question. Only later, when I dug deeper, did I notice just how flimsy their arguments were." https://www.mattridley.co.uk/10784?button Titled: I WAS DUPED. (Btw., Scott Aaronson has a nice review of Ridley's new book: https://scottaaronson.blog/?p=6183 - "Briefly, I think that this is one of the most important books so far of the twenty-first century.")
4. Who is savvy? Who is clueless? Why should anyone pay for mainstream media that has to be read like PRAVDA? Would the Marx/Lincoln tale have been written if the author thought the WaPo cared about facts? Is that author banned now? Did the NYT fire the author of that Scott hit-piece - or the whole board? As Scott wrote: I don't want to accuse the New York Times of lying about me, exactly, but if they were truthful, it was in the same way as that famous movie review which describes the Wizard of Oz as: "Transported to a surreal landscape, a young girl kills the first person she meets and then teams up with three strangers to kill again."
5. Nearly no one is evil (journalists are not, they're just not up to it); nearly everything is broken. Some will try to be "savvy" at the Joe Rogan level, some turn to Scotts (Alexander, Aaronson, Sumner), some to Scotch. SLÀINTE MHATH!
The uncomfortable truth is that, at an elementary level, statistics for journalists are simply numbers selectively used to validate a predefined narrative. Any applied use of statistics is simply not required to get followers. Certainly stats are too abstract to apply to "experts" who provide agreeable information. The journalists you read or hear from today don't have to have had stats training, or ever designed a study, defined assumptions, collected unbiased data from unbiased populations, written a null hypothesis, or been peer reviewed. The truth is that multivariate analysis would correctly answer so many (COVID) questions, but it is so foreign and difficult that, under daily journalism deadlines, such reporting is too hard, too boring, and too complex. Moreover, such analysis simply reduces what otherwise could be a truly "sizzling" headline supported by poor assumptions and little statistical accuracy. And THAT, my friends, is how journalists make money - by getting followers, NOT by publishing the truth. If you're reporting on a car crash, sea turtles on the beach, sports, or weather, no stats knowledge is required.
This reminds me a lot of https://slatestarcodex.com/2017/10/23/kolmogorov-complicity-and-the-parable-of-lightning/
Everybody knows!!
In the mainstream media, I feel that *actual lies* are rare enough that one should mostly not expect them even from less-trustworthy outlets. I think Fox News tends to be very misleading, but I also think that's a result of topic and perspective selection, mostly not actual lies outside of some occasional non-host lies which go less challenged than they perhaps should be.
The WaPo story is common enough on the left end of the spectrum, which is where my bias lies... Not a blatant attempt to mislead, exactly, but involving some pretty big assumptions or leaps that make me raise an eyebrow. This is the sort of thing I've come to expect from political media I agree *and* disagree with, just as sort of a cost of doing business of engaging with political commentary.
In science, however, I'm getting *much more* open to making the assumption that a given researcher is a big damn liar. There have been way too many studies in the last several years, esp wrt COVID, that simply cannot be the result of good, honest scientific effort. This has to be true whether you're on the mainstream or skeptical side of the various COVID arguments; SOME of the research has to be bullshit.
I've actually been having this conversation with my younger sister. She's gotten increasingly conspiracy-minded in the last few years and I keep trying to tell her that *yes*, I am aware this or that group is not entirely honest but that is not the same thing as fabricating a new reality entirely from whole cloth. There is only so *far* you can stretch the truth before ~nobody believes you.
It sounds like you're prejudiced against conspiracy theories. A conspiracy theory is just like any other type of theory. Conspiring is one of the most fundamental human behaviors.
Yeah but ratting out your fellow conspirators is also one of the most fundamental human behaviors. Nobody has a hard time believing in visible conspiracies, e.g. political parties, PACs, interest groups, et cetera. It's the part where we're asked to believe that chatty human beings can get together in groups of thousands to hundreds of thousands and *nobody* ever spills the beans for his personal moment of glory. That's *not* human nature.
Except that they do spill the beans, all the time.
https://en.wikipedia.org/wiki/Haim_Eshed
Look, he spilled the beans. Do you believe in the Galactic Federation now? Of course not.
I totally believe in the Galactic Federation. It's the part where he claims various Earthling politicians are in the trust, and pay, of the Federation that I find highly dubious. What self-respecting BEM would hire human beings onto his Agile team? That's like Apple hiring bonobos to do the iPhone reveal.
Closely related: one of my rules of thumb over the past 5-10 years has been that if I see a complex, intelligent-sounding argument against a dominant narrative which itself seems to be completely ignored (no attempts at a rebuttal to it at least), then that contrarian argument is probably largely valid.
Counterpoint: there are good arguments that such contrarian arguments are following a sort of "just asking questions" routine where they just throw stuff out there for the purposes of poking holes in the dominant narrative, and that it's a waste of time to try rebutting them on the grounds that it's easier to start a bunch of fires than to put them out, etc. (e.g. see Sam Harris' attitude towards anti-vaxxers such as Bret Weinstein), and that this explains why certain arguments aren't getting engaged with.
To your counterpoint - usually this sort of thing can be dismissed with “even if I admit all these facts, does it actually disprove the premise?” Because usually “hole poking” type arguments end with a big leap from the facts to something like “and therefore we know these guys are wrong/lying about EVERYTHING”, and without that leap the holes aren’t all that compelling.
I agree that journalists, experts, politicians, Very Serious People, etc. largely follow the rules of their game when communicating. I don't agree that this makes their communication trustworthy in the sense Scott has in mind, where the savvy can extract a similar (though maybe weaker) signal to what we'd get if we had the unvarnished truth.
The problem is selection effects. To take an example that Scott has mentioned before: how much of a difference in, say, New York Times reporting would we expect to observe if the number of unarmed black people killed by US police in a year was 10 vs 100 vs 1,000 vs 10,000? Surely it would be dwarfed by the difference caused by (let's charitably call them) "consensus effects", where the NYT gauges the importance / tone / narrative of the subject in public discussion (especially elite discussion) and adjusts its coverage accordingly? Yet for any single article it's near-impossible to tell how much it's being driven by consensus vs reality. You'd have to undertake a dedicated long-term research project to extract the "good vs glorious" kind of signal.
Similarly, imagine the concerns of the liberal watching FOX News in the Yankee Stadium hypothetical. They could agree that FOX was reporting facts while still being deeply concerned that its framing and emphasis were calculated to stir up Islamophobia, and that it wouldn't have covered an attack by a white American shooter the same way. And they would have a point.
In the case of news organizations at least, we can confidently extract valid signal from them about certain kinds of ground-level facts, but 1) we rely on them as much or more for *analysis*-- an overall picture of what's going on and why-- and 2) they have almost unlimited influence over *what facts to show us* and have shown willingness to use that influence in the service of their preferred narratives. The fact that they follow a set of rules requiring them to be honest about the ground-level facts they do report is a very weak constraint in this context. How much signal about the true picture of the world could actually be available from it?
This is why I think the NY Times gradually eroding their reliability is a serious issue.
If I have proof of some government wrongdoing—something so outlandish that sounds like a conspiracy theory (Gulf of Tonkin incident faked! FBI tried to blackmail MLK into committing suicide! CIA proposed killing Americans and blaming Cuba!)—I want to take it to an outlet that:
1. Has the resources to investigate and verify my claims.
2. Has the resources to protect their sources.
3. Has the reputation such that if they publish it, *everyone* takes notice and takes it seriously.
For the longest time, I think that was the NY Times more than any other outlet.
If Jacobin publishes this, we'll all roll our eyes and say "okay, sure". If NY Times publishes it, we all agree to take it seriously.
Except now...some people won't, and that's reasonable, for the reasons Scott detailed. NY Times isn't likely to make something like that up, but they're not a bastion of truth. They're merely a bastion of truth-when-it-really-matters. Someone inclined to doubt the story will simply point to all the times the NY Times has misled readers, maybe intentionally or maybe merely negligently.
Whenever NYT or WaPo eats away at their reputation, I'm reminded of what they could be, what we need them to be, and what they arguably were for a long time: widely respected and trusted.
We have no "paper of record" for investigative reporting. I have a good amount of trust in ProPublica, but they lack the necessary name recognition. I don't think anyone's filling that void. It's a really hard void to fill.
I strongly agree with this post. It's a colossal loss for our nation, certainly related as both cause and effect to other colossal losses. It remains to be seen how bad the consequences will be.
A good example of this is Iranian state TV. They're mostly not a trustworthy source on most topics, but then again, they won't just lie about everything without any rules to their game. It is a critical survival skill to be able to draw those lines mentally.
Well, the writer is expert at one thing--covering up for experts who lie. But, as Dan Quayle, a mental patient, says, no one is fooled. By the way, who make up the majority of mental patients? I ain't going to tell.
This seems neither kind (since its accusing Scott of covering up lies) nor clearly true (since it doesn't even attempt to provide proof for any statements). I like when the comments section is held to a higher standard than Marginal Revolution's.
It wasn't meant to be kind; it was meant to be an observation of the bias the author blatantly showed against common people, in favor of "experts." As another commenter noted, the author demonstrated disdain for common folk, and it's not worth the effort to do a point-by-point analysis to critique him.
Then what is the goal of your comment?
> who make up the majority of mental patients? I ain't going to tell.
Very curious what you mean by that, if you don't mind sharing
They (probably) mean liberals. There have been some headlines I've seen claiming that Democrats or liberals (choose your group) suffer more from mental illnesses than conservative people. I haven't read the primary research, but the claim seems very ripe for confounders, and I am skeptical that you could isolate the variables enough to make a rigorous claim.
I see, thank you for clearing this up. It seems that conservative media latched onto a Pew survey from last year which found elevated rates of self-reported mental illness among democrats. Many good reasons to take this result with a grain of salt, like you said, but here are the alleged rates:
56% of young white liberals
28% of young white moderates
27% of young white conservatives
That was an off-handed reference akin to "One Flew Over the Cuckoo's Nest" and many other commentaries on mental health practitioners.
This was a great article! One of the best in a long time.
Here's an anecdote about an experiment in which I debias my news. Last Lent I stopped reading the news, by which I mean I stopped typing NYT, Wash Po, NR, Jacobin, Drudge, or anything like that into my search bar. I feared developing an even more misleading picture of the world because of the selection bias for news stories. Head over to the Washington Times and have your attention yanked into thinking about something that you didn't choose to think about and is likely not going to instruct, deepen, or delight you.
Instead I tried to use that saved time to read books.
It somewhat worked; I read more books in the past year than in any other past year. But as I read less news my substack subscriptions went up, and now I get slightly more substack articles than I have time to read. Still, this is higher quality reading by and large. And very few substack articles I read are biased in a way I can't easily control for, for with substack I know the author's interests, values, and outlook. Is this the coward's path? It's not bias that worries me; it's that even after controlling for it, what's left is vacuous.
Books and substacks and personal emails. I don't know that I am missing anything, if I don't google the news. But would like to hear a defense of reading and checking the news headlines.
Re climatologists: When ~95% of climatologists agree on something, I think preference falsification. And the push towards preference falsification is obvious. To disagree is not to get funding. I've mostly lost faith in any climate thing I hear, first it's distorted some by the media, and then also perhaps distorted by the scientists. Mind you I can see the local climate is warming, and I'm down with more CO2 as part of the cause. (you can replace 'part' with 50% or more.)
1.) Given preference falsification, and obvious conflicts of interest, why should I believe climatologists? (I did follow Richard Muller at Berkeley for a while; he seemed credible. https://physics.berkeley.edu/people/faculty/richard-muller)
2.) I would like to see some talk about possible good from global warming.
a.) longer growing season and more rain is pretty good for agriculture here in the Northeast where I live. (longer corn season, at a personal level.) Won't most temperate zones benefit from warming?
b.) How do you balance warming against the threat of the next ice age? (And why is the next ice age never talked about as a threat?)
I feel like I'm quite a sophisticated consumer of information. Over the past ten years, I've felt the level of sophistication that I'm *using* to sort out truth from error in the news going up and up and up (sort of the way you can feel your mathematics ability being stretched when you do harder problems).
The more I feel myself having to flex my news-consumption muscles, the more I think "We're all screwed, there's no way this doesn't end badly." Nothing about the past five years has made that seem wrong.
(On a tangential point: I followed both the primary sources and the media coverage of the Kavanaugh hearings very closely. Many of my priors about what the media would or wouldn't do were destroyed. I now think it's really quite a minefield, parsing the average media account of anything.)
Great post, by the way. As usual, you put things better than I could have by a lot.
Yeah so most scientists don't directly lie but instead just "bend the truth". Others absolutely do lie. Especially the ones with more political roles related to covid. See also the leaked Fauci emails.
As for the media: they don't directly lie if they think they will get caught. This is for most stories, so it is mostly fine.
Why is the icon for this a picture of the red square? The Russian media has not been particularly reliable for at least the last century, and I think this *does* include clear, explicit lying.
I think a lot of this is built on the idea that conspiracy theory believers are people who don't have good "signal decoding skills", burned themselves by trusting the experts, and are now mistrustful towards everyone. That is the exact opposite of truth, though. Conspiracy theory believers have their own "experts" and trust them far more unquestioningly and for far more extraordinary claims than progressive WaPo readers trust WaPo's history column. There is a spectrum from providing the most accurate and unbiased information to playing at the reader's prejudices and providing them with what they want to hear, with no care to accuracy whatsoever. The Washington Post is, in the big scheme of things, towards the accurate end (although it could certainly be closer). Towards the other end you have things like Infowars, Alex Jones, Wonkette etc. I'm sure there are people who distrust the full range, but 99% of the anti-establishment types are simply people who don't want reality to get in the way of their emotional fulfillment, and prefer news sources which tell them they are right and on the right side 100% of the time, provide them with a steady stream of outrage bait, etc. To suggest that relatively-accurate media is at fault for the existence of those people seems very naive to me.
"news sources which tell them they are right and on the right side 100% of the time, provide them with a steady stream of outrage bait"
WHOA, that sounds like a conspiracy theory! You think people are... conspiring... to provide outrage bait?! What are you, a conspiracy theory believer?
Anyway, you seem to be missing the point. Alex Jones throws out 10 crazy theories a day. His listeners don't just "believe" in what he says in the same way a WaPo reader believes in what they read. There is an asymmetry here: NYT/WaPo/etc. all take on the default establishment position, and then anti-establishment figures poke holes and spread doubt about those positions. NYT readers truly accept what they read as facts coming from experts, and they see these facts repeated everywhere throughout the mainstream media, confirming their beliefs. Alex Jones is up against the mainstream narrative and most of his coverage is framed around discussing and responding to it. What people get out of an Alex Jones rant is not "Yes, I truly and absolutely believe that they're turning the frogs gay", it's "Wow, the media is distorting and lying about a lot of stuff, I shouldn't trust them." The disinformation WaPo believers believe is far more pernicious because they truly believe it with all their heart, as it has been ingrained so deeply into them by a supposed consensus.
Also, your position that mainstream media is "relatively accurate" signals some severe cognitive dissonance. Being 90%, 99%, or even 99.9% accurate is not very good at all - it's quite easy to just *not lie*. The standard should be 100%.
Wrt misleading communication / suppression by experts and other "establishment" people, I think the more interesting way to look at that is the conflict between a virtue ethics framework where you are supposed to convey your best and most nuanced interpretation of the truth, and consequences be damned, vs. a consequentialist framework where experts are aware that some results will be used as propaganda fodder to support false claims, and censorship / misleading communication might well result in less lies and misguided beliefs overall. Like, studies showing that immigrants are overrepresented in violent crime will be used to convince a significant fraction of the population that every single immigrant is a violent criminal, and build a political movement on that lie, so better not give them the tools even if there is nothing wrong with the tool in itself. It's a kind of disinformation arms race - it's hard to stay honest if the other side can lie with abandon and there are no norms or laws punishing liars. You'll just get outcompeted eventually.
"2. If you just do a purely mechanical analysis of the ivermectin studies, eg the usual meta-analytic methods, it works."
Why you agreed to this I have no clue. The usual methods are to pool like studies with like studies and like outcomes with like outcomes. When you do this, and do a purely mechanical analysis of ivermectin studies, using the usual meta-analytic methods, it's not clear that "it works". And that's not just for death, but numerous secondary outcomes as well.
Exactly. Multiple meta-analyses that exclude fraud and highly biased studies have found that ivermectin has no effect.
E.g.:
https://assets.researchsquare.com/files/rs-1003006/v1/8a41eedf-879b-49b8-b2ed-99e58eccf0a9.pdf?c=1642465174
https://journals.lww.com/americantherapeutics/Fulltext/2022/02000/Meta_Analyses_Do_Not_Establish_Improved_Mortality.11.aspx
https://academic.oup.com/qjmed/article/114/10/721/6375958?login=false
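(As an aside, the "usual meta-analytic methods" this thread refers to - pooling comparable studies by inverse-variance weighting - are mechanically simple. A minimal fixed-effect sketch, with made-up effect sizes and standard errors that are not drawn from any real ivermectin trial:)

```python
# Minimal fixed-effect inverse-variance pooling, the standard first step
# of the "usual meta-analytic methods". All numbers are invented examples.

import math

def pool_fixed_effect(effects, std_errors):
    """Return the inverse-variance weighted pooled effect and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]            # weight = 1 / variance
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical log risk ratios from three comparable trials (negative = benefit)
effects = [-0.10, 0.05, -0.02]
std_errors = [0.20, 0.15, 0.25]

pooled, se = pool_fixed_effect(effects, std_errors)
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled effect: {pooled:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

With these invented inputs the confidence interval straddles zero, which is the shape of result the linked meta-analyses report once fraudulent and highly biased studies are excluded from the pool. (Real meta-analyses would typically add a random-effects model and heterogeneity statistics on top of this.)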
Scott! We gotta stop meeting like this! Anyway, I wrote a little something on Twitter (https://twitter.com/alexandrosM/status/1486473068356591618?t=B4yxR4p4ax4bampZAlBkGQ&s=19), I hope to organize my thoughts into a full-length response, but if I had a wish it would be to devote enough time to the conversation so we can agree on what we agree on, and what we disagree on, and why. I am sensing an urge to rise to the abstract and produce an omni-explanation, but this terrain is not friendly like that. Building a position requires in-depth work, and being willing to hold an agnostic stance.
If I have just one request, however, it would be to retire or taboo words like "conspiracy theorist". It seems to function as an easy way to signal someone is wrong without doing the work to demonstrate it, and it's kind of concerning to see leading lights of the rationalist community fling words like these around, as if we haven't all read "semantic stopsigns" and the like.
I know this is a long distance to cross, but in 2022, when we know there was coordination of experts to suppress the lab leak hypothesis, there was coordination of experts to suppress the Great Barrington declaration, the experts were clearly wrong on overstating both the safety and efficacy of the covid vaccines, and masks, and school closures, well, maybe there's something missing from our model and we need to stop climbing into ever higher ground, and start double-checking everyone's work, seriously considering they may have actually gotten even more things completely wrong.
It would be great to have a proper conversation.
In June 2020 Fox News ran photoshopped images of the CHAZ/CHOP protest area in Seattle that added a gunman. https://www.snopes.com/ap/2020/06/14/fox-news-removes-altered-photos-of-seattle-protest-zone/
Bounded Distrust is also how I feel about the CDC and I can see why it sounds absurd people who don't have their distrust delineated the same way I do.
When the CDC says "Hey, we have some recommendations!" my reaction is "the only reason your recommendations matter is because some people are dumb enough to still listen to them." I repeated CDC talking points as late as May 2020 and I feel like an idiot for it. At this point, it feels inexcusable to still be non-critically repeating their recommendations and talking points.
On the other hand, when the CDC says "Hey, check out this cool new data on vaccine efficacy!" my reaction is "Oh, hey, neat-o! Hey, everybody, look at this data, we can trust it because it's from a highly-reliable source: The CDC!"
Because while the CDC will say some bonkers things, fail in astounding ways, and make shockingly misleading statements, as far as I can tell they have yet to falsify data. Falsifying data is a different category of unreliability from the kind the CDC actually engages in.
Without this explanation/understanding, though, my differing reactions sound insane.
The FDA's EUA analysis, even assuming 95% vaccine efficacy, demonstrated that the risks of the vaccine for young males were comparable to (or even slightly riskier than) Covid, and then they approved it anyway. And then the same happened with boosters.
Some people in the committee actually resigned over this, which just means the same thing will continue to happen. So it doesn't really matter whether they falsify data or not - their conclusions are almost completely unrelated to the data.
Call me cynical, but I would not be at all surprised to learn that a major news organization faked a terrorist attack -- or, at least, reported on a routine mugging as though it was a terrorist attack.
I can't see a way to do that. (On the other hand, reporting on a routine protest as if it's a failed coup, I can see how they'd do that...)
> I can't see a way to do that.
A way to do what ? If you mean, "to report on a mugging as though it was a terrorist attack", then it's pretty easy: "According to some eyewitnesses, the attacker, identified as John Smith, walked into the grocery store shouting racial slurs and brandishing a weapon. We now take you to our criminological expert, Dr. McFakeGuy, who will explain how he was able to deduce the attacker's exact rank in the KKK based on his body language..."
> The reason why there’s no giant petition signed by every respectable criminologist and criminological organization saying Swedish immigrants don’t commit more violent crime than natives is because experts aren’t quite biased enough to sign a transparently false statement - even when other elites will push that statement through other means. And that suggests to me that the fact that there is a petition like that signed by climatologists on anthropogenic global warming suggests that this position is actually true.
So, the obvious counterexample is of course American treatment of race. You don't see anyone out there arguing that blacks commit no more crime than whites. But you see a very large number of people arguing with a straight face that blacks are no dumber than whites. One statement is no less transparently false than the other.
(You also see many, many "experts" telling everyone with a straight face that Race Does Not Exist, so black and white are not meaningful categories at all. The implications are never explored and "nobody" takes the claim seriously, but it has plenty of public signatories.)
I would argue that the difference is the amount of elite investment in pushing their preferred claim from behind the scenes. There is not enough cover -- yet -- for people to claim in public that blacks don't commit any extra crime. But cover fire has been laid down for the claim that blacks are no dumber than anyone else. And so there are many public signatories to that claim despite the fact that it is transparently false.
(Connecting back to global warming, we can apply my heuristic to ask how much investment has been made in the ability to make claims whether or not those claims are true, and use the answer to inform how trustworthy we think the claims that actually are made are.)
>Now suppose FOX says that police have apprehended a suspect, a Saudi immigrant named Abdullah Abdul. They show footage from a press conference where the police are talking about this. Do you believe them?
If our positions were reversed, and I was Scott and Scott was a reader, and if Fox actually lied in this way, I could easily come up with excuses as to why this doesn't count. For instance, maybe Fox got bad information and they honestly thought the police apprehended someone but were mistaken. Maybe they showed footage from the wrong conference not to deceive, but because any conference pretty much looks like any other and it's not deceit to find something that gives a good visual representation of the kind of thing alleged to have happened; they're just being overenthusiastic. Or maybe I can just say that the audience was intended to figure out the lie, since they can't fail to know Fox News's reputation.
In other words, under Scott's standards, it's too easy to make excuses and say that even a blatant act of dishonesty doesn't disprove his claim. It's unfalsifiable.
The media said that Kyle Rittenhouse killed unarmed black men. That's blatantly false. Other examples include Trump conspiring with Russia and almost anything the government and media have said for Covid that is now out of favor. By any sane standards, they have lied in the way Scott says they don't, but I'm sure they could be explained away.
For that matter, the New York Times called Scott a white supremacist. Scott said "This seems like a weirdly brazen type of falsehood for a major newspaper." Yet the Times did this all by making insinuations and saying literally accurate but misleading things, so by Scott's current standards, brazen has become not-brazen.
I see a lot of mentions of data that the Vitamin D article didn't discuss:
https://www.peakprosperity.com/2021-year-in-review-the-rise-of-centralized-healthcare/
I get the point of the article and I think I agree. I feel like I'm one of those who reads the tea leaves pretty well in both science and politics. That said, two things pop into my mind and I don't know where to fit them with this attitude.
First, I was taking a college course in the Soviet Union during the first Gulf War. Our class was staying at the Kosmos Hotel in Moscow. It was run by the Soviet state tourism agency, and it did not have CNN at the time. Everyone gathered in the hotel bar on the night the war started to watch the Soviet newscast. Russian friends translated the newscast and told us that the news said Iraq had shot down over 30 US planes and the war was a disaster for the US. We heard many other things that turned out to be absolute lies. It wasn't FOX, but the experience of having a news channel not only shade but outright lie about objective facts never left me.
Second, we have seen many instances of outright scientific fraud. We have an insane number of major studies that cannot be replicated. Finally, we have pretty good recent evidence of political interference in controversial topics. I feel like I still trust the experts, but any time a topic is in any way controversial, I get a squick feeling in my stomach.
Is it possible to watch somewhere North Korea's news with English subtitles? Especially the parts about things happening outside of North Korea. That could be an interesting experience.
Thanks for the post Scott.
I think an important issue in the current media environment is that a third category, "analysis," has become more prevalent and fits between the traditional dichotomy of "information" and "opinion". Anecdotally, I think when people criticize a news outlet for bias, it is often because they are treating an article as information when it is really analysis.
"Information" would be the kind of clear-as-day facts of the kind you mentioned. "The stock market went up today," "The President said x during his speech," "Abraham Lincoln was assassinated in 1865", etc. Of course some bias can be introduced by a publication in terms of what to cover or whose quotes to use, but the base facts are true.
"Analysis" also includes facts, but offers the author's interpretation of them. To me, the Lincoln articles you linked to are great examples of this. Just because the Washington Post article does not appear on the opinion page does not mean that it should be treated as information. The Post author and the rebuttal were using the same pool of facts but analytically came to different analytic conclusions. If someone finds the rebuttal more persuasive it doesn't mean that the Post was presenting inaccurate information, but rather you disagree with the author's analysis.
I think one of the most prominent examples today of this difference is the treatment of how/whether the 1619 Project should be used in schools. At the extremes it's considered gospel or heresy (when I would consider it historical analysis), but to me its best use in the classroom would be as a tool for debate, especially if paired with another publication that used the same facts but came to different conclusions.
Baby/bathwater
So, the key is to know your counterfactuals?
If the shooter was white, FOX wouldn't be airing the press conference at all.
If global warming was fake there would be meta-studies and analysis similar to what we see about ESP.
But of course those meta-studies and analyses all exist. You just discount them as "fringe" and ignore them.
https://slatestarcodex.com/2014/04/15/the-cowpox-of-doubt/
Whoa hang on a minute.
Maybe I've forgotten how to read, but the Marx thing doesn't suggest Trump is un-American or racist anywhere?
Like, literally where does it suggest either of those things?
I think when I try to be savvy, *I* end up sounding like the conspiracy theorist, and I think the article is missing the fact that this perception runs both ways.
For example, I generally assume any item on the nightly news about a pharmaceutical is either a native ad, or the result of a lazy reporter filling time off a press release. If Drug X really marked a turning point in the fight against cancer or dementia, you'd know it, there would be a giant cultural event surrounding this and everybody would be acting a bit differently in their presentation.
But it seems that most people believe these stories are intended to be factual, and when I tell people that pharma stories on tv are basically advertisements and you aren't supposed to believe them, they act like I'm the conspiracy theorist calling it "fake news". By your framing, their alternative to being savvy about this is to conclude that all of these claims were intended as literally true (as framed, even if the text is filled with qualifiers) and that in fact these companies are rampant liars who cannot be trusted at all -- which is what you're calling the "conspiracy theorists". But from their POV "this particular set of people are chronic liars" or "this particular type of claim is always nonsense" isn't a wild conspiracy, whereas the "savvy" person's attempt to play Kremlinologist with everything anyone says looks much loonier.
> But: have you ever heard an expert say, in so many words, that immigrants to Sweden definitely don't commit more crime than natives?
I've heard something structurally and truth-value-wise similar. I've heard:
1. Travel restrictions and gathering limits are racist scaremongering which has absolutely no effect on disease spread.
2. It will take two weeks to bend the curve and stop the spread of the pandemic.
3. Masks are necessary only for medical professionals and useless for ordinary people.
4. Mass gatherings and protests are good for public health and do not cause any concern even in the middle of a pandemic, as long as the issue being protested against is racism.
5. There's absolutely no scientific or at all plausible basis to the idea the coronavirus has originated in a research lab in Wuhan, it is a baseless (and racist) conspiracy theory that no real scientist or expert ever agreed with, and the scientific consensus is firmly and entirely on the side of the proven fact that it originated from an animal source without any involvement of the Wuhan labs.
6. Inflation is a sign of a very healthy economy and is very good for everybody but billionaires.
Of course, some of those had fewer "experts" than others to state them, and they had different lifetimes. But all of them were said in public, and as far as I know, none of the "experts" saying them were publicly shamed and officially stripped of their "expert" status forever and forced to wear the "dunce" caps. So yes, I think at least some experts would absolutely go on TV and say any lie they want, and I usually have absolutely no way to tell this kind of expert from any other.
I must object to the "conspiracy theorist" classification here. We're way past "conspiracy". People lying to me on TV all day long aren't "conspiring" - they are doing their business in the open, brazenly and boldly. They are in power, they are putting the metaphorical boot on the face of the truth, and while they still fail to keep it there forever, they certainly keep trying. I don't know what to call it, but it's not a "conspiracy".
Yes, we should distinguish between "secret conspiracies" and "obfuscation conspiracies".
The argument against "secret conspiracies" is that it is very difficult for a large number of participants to keep a secret. Sooner or later, someone will change their mind; and they can leak the message anonymously. Also, the outsiders who are curious about something may figure it out.
But "obfuscation conspiracies" just make something complicated, and even if the message is leaked or someone figures it out, most people won't understand it, so you just need to *deny* its translation into plain language. So people will be like: "oh, great, there is no poison in the water, there is just some molecular contamination of dihydrogen monoxide, some expert chemistry stuff, nothing that we ordinary folks need to worry about, right?"
The rationalists have a bias here - the entire rationalist exercise collapses when basic facts and data in mainstream academic papers and the New York Times cannot be taken at least somewhat at face value. When these institutions are subverted to political ends, which I posit they have been, the types of conversations that rationalists enjoy having become impossible. This is why people like Scott and Sam Harris are reluctant to drift too far from the establishment narratives around things like Covid. Scott's writing about ivermectin is engaging and I appreciate it, but it always comes across as being driven by motivated reasoning. Because if the establishment view on ivermectin is a corrupt lie (which I posit it is), then we are in a world where nearly unbounded distrust becomes appropriate, and rationalist-type conversations become impossible. We move from the world of facts and data into the world of mythology and spiritualism. Which I posit is the only rational world to inhabit these days.
Unless, of course, ivermectin actually doesn't cure covid. Hypothetically speaking.
Then, I guess, the proper thing to distrust might be people on the internet tirelessly expressing strong opinions about things outside their expertise, which they learned on the internet from other people tirelessly expressing their strong opinions, etc.
Ultimately, whether ivermectin works is orthogonal to my point. My point is that Scott's ivermectin analysis appears to have originated from a pro-establishment bias, and appears to have been the result of motivated reasoning rather than scientific inquiry. My expertise is in motivated reasoning. I'm a lawyer. Motivated reasoning is what I do. In the case of ivermectin, Scott was presented with data that showed ivermectin worked and establishment experts saying it did not. He really wanted the establishment experts to be right. He thus searched for a possible confounder (worms!) and threw it out there, despite no studies or good data supporting this hypothesis.
The worms confounder is the sort of thing a criminal defense lawyer would come up with to raise reasonable doubt against a strong case by the prosecution. It is not the sort of thing one would come up with by taking an unbiased look at the data. An unbiased look at the data creates at least a presumption that ivermectin reduces severity of illness. Actual good evidence should be required to overcome that presumption. Scott had no such evidence to support his theory. He just had an unsupported rationalization to back up the establishment experts. Because if the establishment sense-making apparati are completely corrupt and unreliable, as opposed to merely being deserving of "bounded distrust," the rationalist sense-making enterprise becomes a fool's errand.
I don’t think this fairly describes the situation with the ivermectin studies though. From an early part of Scott’s ivermectin post:
“Of studies that included any of the endpoints I recorded, ivermectin had a statistically significant effect on the endpoint 13 times, and failed to reach significance 8 times. Of studies that named a specific primary endpoint, 9 found ivermectin affected it significantly, and 12 found it didn’t.”
This sounds like when you look at ivermectin studies in a way that limits potential p-hacking, about half the studies say it has a significant effect and half say it doesn’t. Which is weird! If it did absolutely nothing, we’d expect an overwhelming majority of studies to find no positive effect. If it treats covid, we’d expect… I think better than 50/50 in studies? This is genuinely weird. The existence of some sort of confounder makes a lot of sense. Maybe the confounder is worms. Or maybe it’s something like diet, and the drug works on covid in the presence/absence of some compound that has regional variability in human consumption. Or maybe its effect size really does sit precisely at the limit of our ability to detect, so that by chance alone it shows up as significant half the time. If that is the case though, then this becomes less important going forward as covid-targeted antivirals become approved and mass produced.
For examples on this blog where this sort of analysis results in Scott deciding that the establishment conclusion is wrong, see the EEG post from this week (p-hacking) or the post on masks from early 2020 (confounder: did the subject actually wear a mask after control/treatment assignment).
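The "overwhelming majority" intuition a couple of comments up can be checked with a quick back-of-the-envelope calculation. This sketch is not from the original comment; it assumes the studies are independent and use the conventional alpha = 0.05 threshold, which is a simplification:

```python
from math import comb

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Under the null hypothesis (the drug does nothing), each study has a
# ~5% chance of a false-positive "significant" result at alpha = 0.05.
# The quoted figure was 13 significant results out of 21 studies.
p_null = binom_tail(21, 13, 0.05)
print(f"P(13+ of 21 studies significant by chance alone) = {p_null:.2e}")
# astronomically small (~1e-12), so "no effect plus chance" can't explain it
```

Which is the commenter's point: pure noise doesn't produce a 13/21 split, so either there is some effect, or the excess positives come from bias, fraud, or a confounder rather than sampling luck.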
Small studies will be underpowered, especially for a disease as relatively benign as covid. That means nothing about their worth or validity.
https://www.nature.com/articles/d41586-019-00857-9
This is a really interesting point, because it's something I come up against in my role as a union agent. My members, out of long experience with Human Resources, come into any situation with the belief that management is probably lying to them about EVERYTHING, and I have to teach them that there is a trick to it: most half-decent HR types won't lie about certain matters--issues that they don't have a stake in, issues where telling the truth will help their cause, or issues that are easily fact-checked and objective.
It's so hard to butt one's head up against the way that mistrust spreads away from all touch with rationality.
Imagine that you are the kind of person who trusts neither CNN nor FOX. You are in an airport boarding area where a screen is showing CNN. You see news of a shocking event: not a school shooting, but a person driving a car through a crowd of people participating in a parade. The text scrolling at the bottom of the screen gives details of dead/wounded, and goes on to note that "motive is unclear". A press conference with Police representatives appears to support this.
Later in the day, at a different airport, you see a FOX news piece about the same event. In that news piece, the Police are saying that motive is unclear. Yet the FOX reporters have somehow gotten screenshots of social media posts apparently created by the perpetrator. The perpetrator's social media was apparently full of racist denunciations, of the kind of people who were victims of the assault. The FOX report implies that motive is easy to discern, even if they repeat the official statement that motive is unclear.
The perpetrator was a Black man, and the victims of both the racial hatred and the attack were white.
What information do you gain from these two stories?
I wish this were a hypothetical example; it is not hypothetical. I must credit David Friedman with noticing, though he noticed this distinction at two major newspaper websites rather than on FOX/CNN channels.
http://daviddfriedman.blogspot.com/2021/11/all-news-that-fits-we-print.html
Welp. In CS there is an obvious way to handle untrustworthy sources:
Ignore them in your decision making, but track their reliability. If their reliability score recovers enough, you can again give some measure of trust to them.
The same goes for FOX News and the Springer press (aka BILD, Welt, and others) here in Germany.
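The ignore-but-keep-scoring idea above can be sketched in a few lines. This is a minimal illustration, not any real system; the threshold and decay parameters are hypothetical choices:

```python
from collections import defaultdict

class SourceTracker:
    """Track each source's reliability as a running score. Untrusted
    sources are ignored in decision-making, but their score keeps
    updating, so a source that cleans up its act can recover trust."""

    def __init__(self, threshold=0.7, decay=0.9):
        self.threshold = threshold  # minimum score to be trusted (assumed value)
        self.decay = decay          # weight on past history (EWMA, assumed value)
        self.scores = defaultdict(lambda: 1.0)  # new sources start fully trusted

    def record(self, source, was_accurate):
        # Exponentially weighted moving average of the source's accuracy.
        hit = 1.0 if was_accurate else 0.0
        self.scores[source] = self.decay * self.scores[source] + (1 - self.decay) * hit

    def trusted(self, source):
        return self.scores[source] >= self.threshold

tracker = SourceTracker()
for _ in range(5):
    tracker.record("tabloid", was_accurate=False)
print(tracker.trusted("tabloid"))  # five misses in a row drops it below threshold
```

The exponential decay is what makes recovery possible: old misses fade, so a long run of accurate reporting eventually lifts the score back over the threshold.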
Concerning the Ivermectin thing:
"1. If you just look at the headline results of ivermectin studies, it works.
2. If you just do a purely mechanical analysis of the ivermectin studies, eg the usual meta-analytic methods, it works.
3. If you try to apply things like human scrutiny and priors and intuition to the literature, this is obviously really subjective, but according to the experts who ought to be the best at doing this kind of thing, it doesn't work.
4. But experts are sometimes biased.
5. F@#k."
This doesn't imply that there is a paradox; it implies that it's probable that both are true and that your model is too simplistic (i.e., the effect is real, but the proposed mechanism isn't).
It seems to me that the article is about two very different things. One is the limits of how much gets made up, though it seems to me there was something false about babies being taken out of incubators during the run-up to the Gulf War.
https://en.wikipedia.org/wiki/Nayirah_testimony
Perhaps atrocities are more likely to be inventions, and speaking of Yom HaShoah, I gather that one of the reasons accounts of the holocaust were being discounted was British lies about German atrocities in WWI.
Excuse me, I might be a little distracted. However, it's quite possible for the mainstream media to get some things wrong, whether they're making up lies or accepting other people's lies.
Maybe they're most likely to be trustworthy about medium-intensity things. Big enough that they might be paying attention, but not so big and emotional that they can't think straight. This is only a theory though.
The other claim in the OP is that it's possible to pull signal out of media noise if you know enough about both the media and the world. This might be easier when the government has centralized control of the news.
But when there are lies and nonsense coming from several directions? Maybe not impossible, but a lot harder.
I'm reminded of a favorite bit from _Illuminatus!_. There was a man with file cabinets full of clippings* about the first Kennedy assassination. He kept gathering information because he thought there was one fact out there which would make it all come together. He didn't realize half of it was lies made up randomly by people covering their asses.
* you can tell this was written a while ago
This is such a remarkably biased article about bias in the media, like so many bizarre hot takes these days I am left with my mouth hanging open. FOX is biased but they wouldn't tell obvious lies or make up events out of whole cloth? No, but you would have to be blind, deaf, and dumb not to know that CNN, MSNBC, CDC, FBI, and FDA et al. would. This is where common sense comes in. And common sense seems to be more common amongst common people. Privileged people living the high life can afford to believe things that are obvious lies to less privileged people. The lies of MSM about BLM and C19 alone are mind boggling and terrifying. Lies have been blatantly told about major cities on fire, looted, under siege; blatant lies told about C19. How do you figure tens of thousands of doctors saying Ivermectin does work? How do you figure tens of thousands of doctors and nurses, etc. refusing to take the shot that obviously doesn't work and is harmful? This was a truly weird article. Go talk to some normal, everyday people who live in normal everyday places. Never mind. I don't think obvious evidence would make any difference to the person who wrote this.
Do you have any examples of CNN or MSNBC "making up an event out of whole cloth", for events of the approximate significance of Scott's examples?
Russian Collusion?
How can you distinguish a savvy person from a gullible one when you encounter one? How can you tell if THEY are telling the truth?
This entire article was very disappointing. It was basically just Scott restating that he has authority bias in as many ways as he can think of, without actually stating it.
He goes on and on, for instance, about how FOX would never make an accusation so awful and obviously wrong that it couldn't be trusted regarding election integrity, while seemingly being totally unaware that the blue media peddled a completely made up, totally untrue, zero basis in fact "Russian Interference" narrative for three consecutive years, and a very large number of people bought it. Most of them still do buy it.
Then he pivots over to ivermectin and runs "trust the experts" again, but we know for 100% fact that "the experts" suppressed the lab leak hypothesis for fear they would get called a racist by their tribe, and we know that Peter Daszak proposed to manufacture a virus exactly like Covid-19 to DARPA in 2018 *and* to do that manufacture in Wuhan, *and* he orchestrated the Lancet Letter that said lab leak was impossible two years after he literally proposed to build the thing in the lab where the lab leak happened.
So we have evidence of "the experts" lying for tribal reasons, we have evidence of the media apparatus "the experts" use lying for echo chamber reasons. The useful thing for Scott to do would be to start by setting ivermectin aside completely, and doing the smallest remotest bit of work to understand how these tribal social mechanics can, have, and continue to make an entire body of experts *wrong*. Completely without a secret conspiratorial cabal, experts are choosing as a flock to be wrong on purpose, and we know this is a thing that is happening. Unpack that, figure it out, and then come back to ivermectin.
And good gracious, this line of reasoning:
1. If you just look at the headline results of ivermectin studies, it works.
2. If you just do a purely mechanical analysis of the ivermectin studies, eg the usual meta-analytic methods, it works.
3. If you try to apply things like human scrutiny and priors and intuition to the literature, this is obviously really subjective, but according to the experts who ought to be the best at doing this kind of thing, it doesn't work.
4. But experts are sometimes biased.
5. F@#k.
...should end at step 2. Or, if you do go to step 3, you should at least apply the same level of rigor to ivermectin that you used on fluvoxamine a month later, which you obviously didn't do, and any critical readers noticed. So then the question is "why did Scott Alexander spike ivermectin while unspiking fluvoxamine?" and the easy answer, the one most likely, after we've done all of our "why do experts lie" analysis up front, is either tribal ingroup signaling, or it's financial pressures to maintain subscriptions, which is exactly why the "Russian Interference Narrative" ran for three years anyway.
I mean, I don't see any other way to slice this.
To put a fine point on it, Alexandros used your own methodology and simply completed it where you got lazy, and the conclusion flipped from "ivermectin doesn't work" to "ivermectin works," and your only response was "it's not about paper selection but about endpoint quality." That's just word salad, dude.
I personally don't even care if ivermectin works, mostly because I'm not particularly scared of Covid, but I am fascinated to watch the Sensemaking Crisis demolish my most trusted thinkers and I'm horrified to think about what that means for the human race in general.
Point 2 gives way too much credit to ivermectin: "If you just do a purely mechanical analysis of the ivermectin studies, eg the usual meta-analytic methods, it works."
That's not the case at all. Multiple meta-analyses that try to exclude fraud and highly biased studies have found that ivermectin has no effect.
E.g.:
https://assets.researchsquare.com/files/rs-1003006/v1/8a41eedf-879b-49b8-b2ed-99e58eccf0a9.pdf?c=1642465174
https://journals.lww.com/americantherapeutics/Fulltext/2022/02000/Meta_Analyses_Do_Not_Establish_Improved_Mortality.11.aspx
https://academic.oup.com/qjmed/article/114/10/721/6375958?login=false
It's actually quite instructive that your first link is the Andrew Hill meta-analysis, because he admitted on zoom to faking it because he's on big pharma's payroll.
https://twitter.com/alexandrosM/status/1486136274385702912
This is the sort of stuff fundamental to the sensemaking crisis. And it's the sort of stuff Scott was trying to end-around with his worms article. And he was doing it the right way, until he got a little lazy (intentionally lazy?) and tried to manufacture a landing pad for himself that didn't put him in the IVM camp, either for $ reasons or reputation reasons.
"he admitted on zoom to faking it because he's on big pharma's payroll."
Literally nothing in that sentence is true except that Hill had a Zoom call.
He didn't admit to faking anything. To the contrary, his prior results were faulty because they relied on studies that turned out to be fraudulent or highly biased. No one has yet identified a factual or statistical error in Hill's revised article, or given a reason why it should put more weight on highly biased studies.
Moreover, he didn't say anything about being on "big pharma's payroll," which would be a weird thing to say when it's not true.
The zoom call was far earlier than any accusations of fraud. And he admits that his sponsor determined his conclusion. That right there is academic misconduct, and in context, a crime, regardless of whether ivm works or not.
Not clear what the dates are. And in the video, he doesn't admit that his sponsor determined his conclusion. Not sure why some folks seem to have such a different interpretation of this video.
It's there if you scroll down a little bit, and of course it won't change your mind, but see at about minute 9 in this link: https://worldcouncilforhealth.org/multimedia/tess-lawrie-conversation-andrew-hill/
Why let (your possible alter ego) Alexandros take all the heat here? Any response to the fact that your accusation was completely wrong?
I'm not Alexandros, and Alexandros has sometimes confused "motteposting" with me. We're not sure who he is yet.
It's pretty interesting watching the differential in discourse regarding these topics between the comments section and r/themotte. There appear to be different factions of Scott fans who approach sensemaking differently.
My approach to sensemaking wouldn't include attributing statements to Hill (or any other person) things that he didn't even hint at saying.
" Or, if you do go to step 3, you should at least apply the same level of rigor to ivermectin that you used on fluvoxamine a month later, which you obviously didn't do, and any critical readers noticed. "
Even if the evidence left standing on ivermectin looked as good as the evidence on fluvoxamine (not true), they would still be in very different standings. Fluvoxamine hasn't been the subject of numerous frauds and highly biased studies, and isn't constantly being plugged by snake oil salesmen and their fans (*Alexandros) as being a miracle cure, etc. From a rational perspective, a drug with a solid 30% effect (but no wild exaggerations and frauds) is more likely to be valid than a drug with a 30% effect that is mostly promoted by hucksters.
You falsely accuse me of saying things I have not, but of course this is par for the course. However, the amount of junk studies associated with Ivermectin has not been uncharacteristically high. About 20% of studies are generally expected to be in that category, and results from the Ivermectin literature are actually lower than that, believe it or not. On the other hand, studies from the Middle East are well known to be of very low quality most of the time, which we have also seen with Ivermectin. Once again, nothing out of the ordinary, except for the politics.
Correction: Ivermectin has been plugged almost exclusively by true believers like Kory, Lawrie, Marik, etc. They have lied about it being a miracle cure, and have promoted fraudulent studies without shame or apology. You have mentioned repeatedly that you donated to them (FLCCC). Nonetheless you haven't *explicitly* endorsed their "miracle cure" statements, but instead when pressed, back off and say you care about the rationality of the discourse, or the information ecosystem, or something like that. Fair enough.
I've donated to the people who have also been giving fluvoxamine for a year, and the people who were ahead of the curve on steroids for severe hospitalized cases. They also believe ivm works. Btw, this wording on ivm starts from Satoshi Omura himself, and of course nobody blinks when the vaccines get the same kind of characterization regardless of their limitations.
https://www.ncbi.nlm.nih.gov/labs/pmc/articles/PMC3043740/
Anyway, I'm done wasting time with you.
Not sure what the relevance of a 2011 article on ivermectin is here. It does say ivermectin is a "wonder drug"--for many different parasitic infections. But so what?
By analogy, Gleevec is a miracle drug for certain types of leukemia, but that doesn't mean anyone gets an automatic pass if they say, "Gleevec is a miracle drug for literally any other random and unrelated disease, now including Covid! Here are a bunch of crappy and often fraudulent studies, along with some eyeballing of international charts that don't even pass the laugh test, along with some cites to people who are so dishonest that they claim bicycles kill 1,000 times as many people as Covid (e.g., Marik in the recent Ron Johnson hearing)."
The notion that 20% fraud/bias is just par for the course is way too glib and facile. As James Heathers wrote (https://www.theatlantic.com/science/archive/2021/10/ivermectin-research-problems/620473/):
"If five out of 30 trials have serious problems, perhaps that means the other 25 are up to snuff. That’s 83 percent! You might be tempted to think of these papers as being like cheaply made light bulbs: Once we’ve discarded the duds with broken filaments, we can just use the “good” ones.
"That’s not how any of this works. We can locate obvious errors in a research paper only by reanalyzing the numbers on which the paper is based, so it’s likely that we’ve missed some other, more abstract problems. Also, we have only so much time in the day, and forensic peer review can take weeks or months per paper. We don’t pick papers to examine at random, so it’s possible that the data from the 30 papers we chose are somewhat more reliable, on average, than the rest. A better analogy would be to think of the papers as new cars: If five out of 30 were guaranteed to explode as soon as they entered a freeway on-ramp, you would prefer to take the bus.
"Most problematic, the studies we are certain are unreliable happen to be the same ones that show ivermectin as most effective. In general, we’ve found that many of the inconclusive trials appear to have been adequately conducted. Those of reasonable size with spectacular results, implying the miraculous effects that have garnered so much public attention and digital notoriety, have not."
The selection of the 30 trials Heathers wrote about is for the biggest results. Obviously once you select for outliers you get outliers. This is elementary, really. And we don't even know exactly which studies those are. The crusaders of transparency are shockingly opaque.
We do know exactly which studies have been critiqued by Heathers et al., because they did it publicly.
And the point about selecting for outliers cuts in the opposite direction of what you're implying here! If Heathers et al. selected the 30 (or so) most well-known studies with the biggest results, they had a sample of studies that provided the best case scenario for ivermectin working. And even then, it turned out that the studies with the biggest effects were the most unreliable, whereas the studies that looked defensible were inconclusive. So the selection bias (if any) works in exactly the opposite direction of what you're trying to imply--even with the "best" studies for ivermectin on the table, ivermectin still comes up wanting.
Go ahead, give me the list of 30 studies. Should be simple.
There's a big Canadian trucker convoy that's protesting a vaccine mandate for Canadian truckers traveling into the US.
https://www.cbc.ca/news/canada/london/trucker-protest-convoy-southwestern-ontario-1.6329118
There are articles about it by major news sources, but I haven't heard about it from NPR, and it's not listed on google news.
In general, I've heard complaints about large non-violent demonstrations not getting into the news.
Very good piece and pretty much the way I myself navigated the news for the last few years: confident I could recognise spin from what is likely to be factual and just shrug at spin and bias. However, the game changer in this pandemic was the way the idea of a potential lab leak of SARS-CoV-2 was dealt with by the scientific community. The sheer firepower deployed by top scientists (and big tech and mainstream media) in attempting to torpedo the idea of a possible lab leak (and the careers of anyone who gave it credence) became paradigm shifting for me. We learned of Gain of Function research carried out at the Wuhan Institute of Virology, NIH funding, Kristian Andersen professing to believe the virus looked engineered in private (email to Fauci) and calling the same idea ‘tinfoil hat conspiracy’ publicly, the shenanigans with the WIV Coronavirus database, RaTG13 etc.
To stay with the blog’s metaphor it was the equivalent of discovering that Fox News had indeed staged a mass shooting and was threatening those in the know to keep their mouths shut.
You basically say ‘there are rules and savvy people know what they are’. This is true, but shouldn’t we all be constantly recalibrating on the basis of what suddenly enters the realms of possibility? The scientific community made an extraordinary attempt to control the narrative on the origin of C19 in a bona fide conspiracy to obfuscate and intimidate and smear anyone who knew better (with the help of big tech and the press, of course). Isn’t this a new ‘rule’ I have to take into account from now on? How does this affect the meaning of the expression 'spreading misinformation'?
Interesting to see so many commenters here saying "yes, yes, UNTIL things changed with [COVID/Trump/BLM/insert issue]!" Nothing has changed, fundamentally. You're still basically just determining how much confidence to assign to each piece of information you receive. I wonder how much of an emotional component there is to this way of thinking. People get pretty insulted or disgusted to discover they were misled about something. Committing to disbelieve the source absolutely in response, however irrational that actually is, might function as a way of hitting back.
Another idea is that maybe these people believe that overreacting to instances of media untrustworthiness will eventually improve reporting standards.
I REALLY wish some of the wording in this article was different, so that I could send it to some of the "clueless" I know 😝
This seems insightful and useful - I've recently had in-depth discussions with colleagues who generally each carried what to me is a conspiratorial perspective on COVID and government response. They were actually enjoyable conversations on balance, we kept rapport, and I began intuiting underlying differences in thinking to account for my perspective seeming categorically different from theirs. One of them is scientific literacy. My greater capacity there, though, may have also let me lose sight of some larger picture stuff which they were frustrated and suspicious of: "They told us the vaccines would work and now they don't". I could counter that with points about probability and risk and viral mutation - but I had extracted so much fine-grain signal that I had forgotten the basic point that they were pointing out: vaccines were represented as our way out of the pandemic and it is not working out that way. Which has implications moving forward (I don't believe it's because of some heinous conspiracy as they may be tempted to, but the point stands, and my perspective shifted quite significantly).
> I had extracted so much fine-grain signal that I had forgotten the basic point that they were pointing out: vaccines were represented as our way out of the pandemic and it is not working out that way.
This is a good point. The fine-grain version of "vaccines are our way out" is still true in a substantial (if incomplete) way, but if you hear it as something simpler that implies "if you get your shots, we'll stop trying to tell you what to do", then you're going to be pretty disappointed.
I agree that talking to people who have a hard time extracting signal can be helpful. I'm less savvy about various geopolitical things than some of my friends. I'll say something to them like "I can't tell how much of this stuff with <potential conflict> is a literal statement of intent, vs some calculated diplomatic/political move that shouldn't be taken literally. How much more worried should I be?". The answer is usually something like "It's complicated and hard to say, but you can at least conclude that <signal they've extracted>, so I'm <more/less/about the same> worried". This usually doesn't improve my ability to extract signal very much, but it does help me calibrate my confidence in that ability. This is important, because it means I don't have to adopt a stance of complete epistemic helplessness.
I think your statement "I think some people are able to figure out these rules and feel comfortable with them, and other people can't and end up as conspiracy theorists" is a bit condescending. Dr. Fauci's, as well as many other world-renowned experts', year-long denials that the virus could have come out of the lab are now being challenged by a large number of scientific experts who are not now being labeled as conspiracists.
Can you quote some of these "denials"? If they're along the lines of his 2020-04-17 comment that "the mutations that it took to get to the point where it is now is totally consistent with a jump of a species from an animal to a human,"[1] well, that certainly isn't saying it's not possible that it came from a lab.
It sounds to me as if you missed this bit in Scott's article: "before you object that some different global-warming related claim is false, please consider whether the IPCC has said with certainty that it isn’t, or whether all climatologists have denounced the thing as false in so many words. If not, that’s my whole point."
[1]: https://www.msn.com/en-us/news/world/timeline-of-what-dr-fauci-has-said-about-the-wuhan-lab-and-covids-origins/ar-AAKn3P3
You do know Fox has been owned by Disney for several years now?
Personally I don’t trust any media outlets. We’re being told what the government wants to tell us.
Disney only acquired some of the Fox media assets. In short, Disney got the arts & entertainment, but an independent company still controlled by the Murdoch family kept the news and sports.
If we're being told what the government wants us to be told, then why do some media outlets report critically on the government?
Mirrors are for reflection. Conspiracy is not always fact but many times just perception!
For the millionth time, this post puts something nebulous into concrete, precise terms and fleshes it out. Kudos. Bonus points for talking about conspiracy theorists like they're human beings - far too much of the discourse on them assumes they're deranged and tries to figure out the psychology behind the derangement, without ever considering that news sources and experts *are* disingenuous and lazy a lot of the time and it's perfectly natural to notice that and overreact to it a little. Sure there are people who take things way too far and make a whole lifestyle out of it, but I find it a little disturbing how much discussions of MSM mistrust focus on the tinfoil hat crowd, to the exclusion of people who...just don't trust the media very much.
From all the replies I gather, first, that the author is pointing out something that can seem obvious to a small group of people, but by no means to all 'savvy people' in general; and second, that many good points are raised, but a bad aftertaste remains. Ultimately it is poor practice to talk in dichotomies and spin the media bias in the author's favored direction.
I'm not sure the stuff about 'glorious' vs. 'good' harvests is consistent with bounded distrust. Do you think there is a bound where the Russian government wouldn't say 'glorious' when there is a bad harvest, or that the govt. would say neither 'tax increases' nor 'revenue enhancements' but instead introduce 'progressive policies'?
It seems that bounded distrust makes sense in repeated games with monitoring, but the government and parties raise so much noise that monitoring the signal doesn't happen and bounded distrust no longer works. Maybe it still works for science and journalism.
"Ivermectin does not work" and "man made global warming is real and dangerous" are both important points of the Left dogma.
If I am on the Left and I use my Figuring-out-the-rules-of-the-game skills, and I exclusively end up approving bits that are part of the Left dogma, my Figuring-out-the-rules-of-the-game skills should be highly suspect.
I should therefore really, really be looking to approve a sufficient number of equally highly valued bits of the Right dogma. I would search for them, record them, and put them on my wall in picture frames.
Didn’t Fox News get caught Photoshopping photos of suspects to look more middle eastern on more than one occasion?
But it didn't use to be like this.
I think it is a great overcomplication of two simple rules:
1. Is it profitable/convenient/useful for the perpetrator of the lies?
2. Can she easily get away with it?
If the intersection of these 2 factors is sufficiently good, the lie will be perpetrated.
I always use these criteria and am right in nearly 100% of cases. Of course, it comes mostly from the generalized Left. Right-wingers that try this don't live long, politically, professionally, and physically. We still have some standards!
--
Cohen the Barbarian
And, yes, of course, being in an airport and hearing Fox News on TV is as implausible as UFO stories. The only place where I ever saw Fox on TV in public was a certain kosher restaurant on 47th between 5th and 6th, and that was probably over 6 years ago. Don't know if they survived COVID.
--
Cohen the Barbarian
Scott's reflexive hatred of socialism shines through all half-hearted attempts at seeming objective. Sad, but I suppose it is inevitable based on the rightist slant in even our "moderate, centrist" discourse.
Now I'm wondering if 4 out of 5 dentists really do recommend Dentyne for their patients who chew gum.
Awesome. This seems to omit discussing that our very standards of truth are often downstream internalizations of these external institutions' "truths". Most people's "truths" are composed of fractured, socially reinforced memes that have very little to do with reality testing, cohesion, or explanatory power. For these people, the heuristic of __don't/do trust the experts__ isn't as important as "does this vaguely jibe with my internalized metrics of when I believe something".
This seems important to mention because conspiracy theorists do have experts and authorities, they just code and signal their "truths" in a way that conforms better with the conspiracy theorists' internalized truth metrics. Maybe you were just going for the point of "selective sampling explains the complexity of some people's heuristics" in which case I would simply agree, and want to add that people generally avoid model complexity and prefer the buckets of "yes, no, and maybe".
> I stuck with my believe
Typo, should be "belief".