:prepares for comments section filled with people giving counterexamples for all the things Scott just said the media doesn't do:
I think part of the problem is that a sufficiently advanced ability to find people who are lying and be unreasonably credulous toward them is practically indistinguishable from lying - there are rules, but the rules have exploits so wide that one can reasonably call the game broken.
Man, every day I discover/someone points out some finely tuned heuristic I have running all the time.
A big part of the frustration of the moment is how much the rules of the game changed during the Trump presidency, at least for mainstream liberal media, and even for institutions. I thought I was reasonably familiar with the rules of the game, but things like the CDC statement on the BLM protests and the censorship of 'lab leak' theories caught me off guard – I would have previously said those weren't the kinds of lies to watch out for. It's been a difficult and frustrating adjustment.
Based on the first section, this fatally fails to distinguish between Fox commentary (i.e., Tucker Carlson, Hannity) and Fox NEWS. They are different. Indeed, that's how Tucker beat one lawsuit, by arguing that no reasonable person would see what he does as news.
"I believe that in some sense, the academic establishment will work to cover up facts that go against their political leanings. But the experts in the field won't lie directly. They don't go on TV and say "The science has spoken, and there is strong evidence that immigrants in Sweden don't commit more violent crime than natives"."
A possible exception to this rule: https://twitter.com/Telegraph/status/1481176891998490624
Emails from the start of the pandemic show that some of the leading scientists working on emerging viral diseases thought that a lab leak was reasonably likely but then they signed a letter in the Lancet saying the exact opposite (https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(20)30418-9/fulltext).
Another great article. It seemed like you were humanizing our two political tribes to one another. A good cause!
These types of subtle nuances in 'bullshit-detection' calibration seem like a skill that is apparently very difficult for a lot of people to attain, and I'm thus pretty skeptical of any efforts to teach this kind of stuff in public schools in the form of 'media literacy' (although I would love to see evidence to the contrary). It seems just as difficult as teaching someone to be charismatic: a lot of subtle nuances that are hard to communicate as bullet points, and that are better represented as complex multivariate distributions. I'm curious if anyone feels like they were 'taught' how to be really good at this by anything in particular.
Re the Swedish piece - they have rules about what kind of research is ethical, and you have to have certain kinds of permits. The argument is about whether the authors followed the rules. I think they did follow the rules, and the prosecution is foolish, as the initial prosecutor concluded! For Scott to make this about "the establishment" - just really weird. Certainly doesn't make the point it seems he's trying to make.
"I think, I think, I think." You know, Scott, if you had even an iota of data here, instead of your unbounded faith in your own gut intuitions (aka priors), you might have something valuable here. All you're saying here is "If my unsubstantiated belief 1 is true, and unsubstantiated belief 2 is true, boy is that ever outrageous!"
For those interested:
I think any researcher who found that immigrants were great would not have the technicalities of their research subjected to this level of scrutiny, and that the permissioning system evolved partly out of a desire to be able to crush researchers in exactly these kinds of situations. I think this is a pretty common scenario, and part of a whole structure of norms and regulations that makes sure experts only produce research that favors one side of the political spectrum. So I think the outrage is justified, this is exactly what people mean when they accuse experts of being biased, and those accusations are completely true.
I am quite surprised that the first part of this essay does not mention the distinction between news articles and opinion editorials at all; after all, it seems highly relevant. So relevant, in fact, that it might make the rest of that section superfluous.
For instance, liberals might very well trust Fox News when it is purely reporting the facts on a story, since it is only a right-leaning media source, not an outright fake news website in the mold of InfoWars. But, in many spots, they might immediately dismiss any opinion piece as trying to push a narrative that their tribe rejects.
In fact, in many online (and real-life) discussions with liberals and leftists, one of the criticisms they levy against Fox News is that they intentionally mislabel some of their articles as news reporting, when in reality they are closer to opinion pieces, specifically so that readers are more likely to believe them.
And again, he fails to distinguish between opinion pieces in the Washington post and news articles.
A dimension of this mess that Scott is not touching on here is the whole „so who is an expert“ quagmire. Think of the Covid fiasco, and the plethora of „experts“ on all sorts of things it brought out of the woodwork. For people who are struggling with understanding a complex situation, it’s often not a trust the experts vs. distrust them situation: it’s „who are the experts in the first place“?
"That's probably a bigger lie (in some sense) then one extra mass shooting in a country with dozens of them"
"people can’t differentiate the many many cases where the news lies from them from the other set of cases where the news is not, at this moment, actively lying."
Should "from them" be "to them"?
It's not literally true that "experts in the field won't lie directly". There are two ways in which experts in the field will totally lie, and do so all the time. First, they'll be mistaken (maybe you don't count that as a lie, but from the point of view of an observer it can be functionally the same). For any proposition X, there's some distribution you get if you ask experts "how likely is it that X", and there'll be some (hopefully small) fraction of experts who are just wrong. Second, there's some fraction of experts who lack scruples. It can be a small fraction, I don't care, but it's nonzero, and so you can always find an expert to go on a podcast and blather any claims that you want.
This wouldn't matter, except now the other experts (who aren't grossly mistaken, and who have scruples) are likely to become in a sense complicit. "Not lying" is much easier than "calling out a lie". There are many reasons not to call out a lie --- political inconvenience, being associated with icky people who also call out the lie, not having enough time. People don't generally think (even if they claim otherwise) that there's some strong moral requirement to put your career on the line to correct some false statement made by a supposed "expert" in a paper, or online, or in the news. It's easy to justify inaction by saying "oh, the lie was of little consequence", without noticing how often that really means "of little consequence to *me and mine*".
The result of all of this is that if you consume the news, or the scientific literature, you can in fact be consuming outright lies. The small fraction who are grossly unethical, or outright stupid, make the lies (crossing the line!), and then others are reluctant to do anything about it (not crossing the line).
This doesn't invalidate the central idea of "bounded distrust". It's still the case that a sufficiently extreme lie ("the normal distribution posits that there is a normal human", or whatever that was) will receive substantial pushback --- although note that even there, people were reluctant to be associated with Razib Khan, and so took their names off of the petition! But this does move the invisible line of "things that are just not done" a bit further in the direction of dishonesty. What matters isn't so much what the median expert will *do* as what the median expert will *tolerate*.
From my viewpoint, it looks like the median expert will tolerate quite a lot of dishonesty, as long as it's "not of any consequence (for me and mine)". This varies by field, of course, as some fields have more of a culture of rudely calling out bad claims than others.
Other collective effects also reduce (in my eyes) the trustworthiness of amorphous "the experts". Just one example: who are the experts? Unscrupulous and incompetent researchers can create, by exploiting the politeness of their peers, an entire body of poor literature (here I'm thinking of "near-term quantum simulations", but there are plenty of others!). Now if I want to query the experts about this literature, who do I ask? The people who write papers about it? Not a good strategy, but it's very difficult to know who the correct expert to ask is. Should I ask "the inventor" of mRNA vaccines about the properties and effectiveness of the Pfizer/Moderna vaccines?
The upshot of all this is that if I have a friend who knows something about a field, I'm not particularly sensitive to all these collective effects, and I can reliably extract quite a bit of signal. If I'm relying on observations of the behavior and claims of "the experts" and "the journalists" and "the politicians", then even under optimistic assumptions about their individual honesty and competence, the amount of available signal is substantially reduced.
The concept of "everything" seems to very easily morph into the concept of "anything" in people's minds without them really noticing the difference, i.e. "You can't believe everything you read" becomes "You can't believe ANYTHING you read" and is defended with arguments that only support the former statement, not the latter.
'ambiently watching the TV at the gate.' I've never seen ambiently used like this, is it a typo or actually idiomatic?
Great article examining something common that usually isn't thought about explicitly. I think trust is in most situations contextual - I know people who I'd trust not to steal or lie but not to show up on time etc.
For the political implications, I think trust and power are closely connected, because in a sense if you trust someone you give them power over you, since they can then control what you believe, which will then influence the choices you make.
Where this gets dangerous is not so much people giving up and trusting no one. It's when someone comes along with the message "all institutions are bad, trust no-one but me" and people believe them.
Because at that point, since trust = power, they have quite a lot of power. You could do almost whatever you want and people will still support you. For example you could say that you didn't lock up your political opponents - they committed crimes. Or that you didn't overturn the election - you just found fraud. Or that you didn't start the war, or that the war was necessary.
With you mostly, but I was waiting for you to acknowledge you were wrong about Ivermectin and why.
That you didn't says you haven't moved with the times, and although your general perception of the paradigm's workings is correct, the specifics have changed, and that is why you are still applying the old rules.
See Adam Hill's Zoom today for example
I regard conspiracy theorists a bit differently. This theory basically says that media is an interpretive process, effectively an act of mutual interpretation between broadcaster and receiver. The broadcaster is trying to convey what they want the other person to believe. So far we agree. But you posit that the receiver is trying to determine what is true and what is false in the broadcast. Conspiracy theorists are people who are doing this badly.
I don't think that's true. I think the receiver is trying to determine what they should personally do. They're not actually invested in truth or the institution of news. (I suppose this makes me overly cynical since it means NEITHER side is invested in truth.) For example, take the vaccine stuff. The news is trying to broadcast the message the vaccine is safe, necessary, etc in an attempt to get the person to take the vaccine. The receiver isn't fundamentally trying to determine whether any of this is true. They are trying to decide whether they will take the vaccine. Whether they should socially pressure other people to. And so on. Part of that is undoubtedly determining whether the news is telling the truth. For example, if the news reports the vaccine makes you grow wings and no one's growing wings then that's pretty relevant. But only a part and it's certainly not a necessary condition.
Once a person makes a decision they construct an epistemology that justifies this decision. Or alternatively they already have an epistemology and it creates the belief. That's complex. Regardless, this is true for both broadcaster and receiver. Conspiracy theorists are people who construct epistemologies focused around conscious deception (a conspiracy). Like most epistemologies it's communal rather than individual. This creates a social-cultural network/pattern. Which of course the broadcasters and non-conspiracy theorists have too.
The conspiracy theorist's central, unfalsifiable claim is both powerful and handicapping. Because it's unfalsifiable and often totalizing ("everything is Illuminati!") it makes it difficult for them to effectively achieve their ends. Even when they win it often doesn't achieve what they want. On the other hand, this is an ideal way to spread and maintain itself. Someone with concrete goals ("get everyone vaccinated") must eventually come to their end. Someone with a vague unachievable goal ("eliminate the Illuminati") gets to flexibly gloss over policy details and apply their lens to every situation. And they never have to deal with the goal being achieved. Victories and defeats occur but never the ultimate victory or defeat. And this fight can pay pretty direct benefits to its members. Sometimes even on a society-wide scale.
In summary: Conspiracy theorists are not failing at being mainstream. They're succeeding at being conspiracy theorists.
For the record, while your overall point is interesting, the choice of examples (Fox News, immigrant rapists, ivermectin) is sort of annoying and leaves an aftertaste. Those topics are sort of emblematic of an intellectual niche which is, to be blunt, AMPLY covered by other outlets.
Most media is better interpreted as intended to entertain than inform.
I'm a professional media critic. My assumption from decades of close reading of the New York Times is that if I read a statement in the Times, it's very likely true. For example, if the New York Times tells me an Asian woman named Michelle Go was shoved to her death on the subway tracks by a man named Simon Martial, I'm sure that's true.
If the Times were to tell me Simon Martial is white, I'm sure they wouldn't be lying.
On the other hand, the Times finds some other facts are not fit to print. In particular, the Times does not like to go out of its way to raise doubts in the minds of its subscribers about their general picture of who are the Good Guys and who are the Bad Guys that they've developed over their years of relying on the Times for news.
Therefore, both Times articles I've read that mentioned that the victim Michelle Go is Asian did not mention the race of the perp Simon Martial.
Coulter's Law states that if the news media report on an outrageous crime but don't let you know the race of the perp, he's usually black and almost never white.
More specifically, the Times has heavily promoted the theory that violence against Asians is due to Trump saying the words "China virus" a couple of years ago. This is a popular idea among The Times' paying subscribers. An alternative hypothesis is that misbehavior by blacks (e.g., shootings and car crashes) is way up since the mostly peaceful protests of the racial reckoning.
But most subscribers do not want to hear evidence for that. To even entertain that idea would raise serious questions about who exactly are the good guys: Is the Times itself a bad guy for promoting a bad idea -- Black Lives Matterism -- that has gotten thousands of incremental blacks killed violently since 5/25/20? Most of the Times' millions of subscribers are quite content with their notions of who are the good guys and who are the bad guys (Trump and Trump supporters) that they've derived from reading the Times and might not renew their subscriptions if the Times itself were to print more facts challenging the worldview the Times has inculcated in them.
But it's even more complicated than that: many Times reporters are excellent and would prefer to report the full story. So what I've often noticed is a frequent compromise between the marketing needs of the Times not to trouble subscribers with unwelcome facts and the reporters' desires to publish interesting facts. Often, if you read NYT articles all the way to the end, you'll stumble in the later paragraphs upon subversive facts that, if you think carefully about their implications, undermine the impression the headline and opening paragraphs give. Of course, most subscribers have stopped reading by that point, so they never notice.
I would like Lincoln more if he were friends with Marx. It would show he considered different opinions to his own and was humble enough to discuss ideas he disagreed with.
I disagree strongly with the characterization of the Swedish study. The study really did focus on immigration status as the most prominent result of the analysis.
In particular, Scott claims that, according to the linked article, immigration status was not "a particular focus of their study" and that "although it wasn't a headline in their results, you could use their study to determine that immigrants were responsible for a disproportionately high amount of rape in Sweden."
I went and looked up the original article and skimmed it. Here is the first paragraph of their results section:
"Between the years 2000 and 2015, a total of 3 039 offenders were convicted of rape+ against a woman (Table 1). The majority of the offenders were men (n = 3 029; 99.7%) and the mean year of birth was 1976 (SD 12.3). Close to half of the offenders were born outside of Sweden (n = 1 451; 47.7%) followed by Swedish born offenders with Swedish born parents (n = 1 239; 40.8%). A relatively small part of the cohort was constituted of offenders being born in Sweden with at least one parent being born outside Sweden (n = 349; 11.5%). Table 2 shows from which regions the first- and second-generation immigrants and their parents originate from. Among Swedish born offenders with one parent born outside of Sweden (n = 172), the foreign-born parent was mostly born in Western Countries (72.7%) followed by Eastern Europe (11.0%). Regarding Swedish born offenders with no parent born in Sweden (n = 177), a high proportion of the mothers and fathers were born in Western countries (40.7% and 33.9%) followed by the Middle East/North Africa (19.8% and 24.0%). The largest group of the study population was found among offenders born outside of Sweden (n = 1 451); a significant part was from the Middle East/North Africa (34.5%) followed by Africa (19.1%)."
I think this is the definition of making something a headline of one's results. One of the most prominent pieces of information in the results is the breakdown of cases by immigration status. It specifically says that more offenders were born outside of Sweden than born in Sweden to Swedish parents.
It looks like this mischaracterization was not present in the news article that Scott linked, which discusses this research paper. In that news article, they specify (with quotes from the authors) that the original purpose of the research was not to focus on immigration status, but that it was something they discovered by chance while doing the research. In particular, the claim that immigration status wasn't a headline of their results seems to have been introduced by Scott.
I don't know whether Scott had access to the original research paper - I couldn't find a freely available copy of it. However, this same highlighting that I quoted above is also present in the abstract of the paper, which is freely available. Here's the relevant content from the abstract:
"A total of 3 039 offenders were included in the analysis. A majority of them were immigrants (n = 1 800; 59.3%) of which a majority (n = 1 451; 47.8%) were born outside of Sweden."
The abstract is freely available here: https://lup.lub.lu.se/search/publication/e2c65632-50e1-4741-a1b3-21664eaf7724
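As a quick arithmetic sanity check on this reading (a sketch using only the counts quoted above; the variable names are mine, not the paper's), the three subgroup counts do sum to the stated total and reproduce the quoted percentages:

```python
# Counts reproduced from the quoted results section / abstract above.
total = 3039                 # offenders convicted of rape+ against a woman
foreign_born = 1451          # born outside of Sweden (quoted as 47.7%)
swedish_born_swedish = 1239  # Swedish born, Swedish born parents (40.8%)
second_generation = 349      # Swedish born, >=1 foreign-born parent (11.5%)

# The three subgroups partition the cohort exactly.
assert foreign_born + swedish_born_swedish + second_generation == total

for label, n in [("born outside Sweden", foreign_born),
                 ("Swedish born, Swedish parents", swedish_born_swedish),
                 ("second generation", second_generation)]:
    print(f"{label}: {n} ({n / total:.1%})")
```

The shares come out to 47.7%, 40.8%, and 11.5%, matching the quoted results section, and the first and third categories together (1 451 + 349 = 1 800) give the abstract's "majority were immigrants" figure.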
I don't disagree with Scott's overall point, which is that the researchers faced repercussions for their findings, repercussions that they likely would not have faced had they found the inverse conclusions. But I strongly disagree with the implication that one would have to go out of one's way to use their study to determine that immigrants were convicted of rape at a disproportionate rate. It makes it sound like the scientific establishment is raking through papers only tangentially related to this topic to find people to crush, and that's just not what happened.
In a way, finding this small but important inaccuracy in this essay drives home the overall point of this essay. It's necessary to distrust every source, to the extent that they're willing to stretch things or not double check things or generally be unreliable. And that applies to this essay as well.
As someone who is a journalist and a fairly close follower of Swedish debates on crime and immigration (I lived there for seven years, for three of them as a working class immigrant in what is now a ghetto but then wasn't), I think Scott is 90% right, but missing one important journalistic skill, which is that we know which experts to trust, and how much to calibrate in each case what you might call the Pravda factor.
You have to remember that no expert or insider will tell the whole unvarnished truth in public except in very rare cases. This is normal and natural. Either they will be misunderstood, usually deliberately and often by their own side, or they will be ignored.
But if you're lucky, and have something to trade, and if they have learned that they can trust you, they will talk much more honestly in private. Given that the Swedish debate about immigration and crime is so inflamed, and the public story so very different from the things people assume in real life, the first thing I'd do is ring up a criminologist friend and ask if this story is bullshit. That would be off the record and it would have to be. Unless they felt there was a huge injustice going on, taking sides publicly would be as pointless as joining in a twitter spat.
What they told me would feed into what I then wrote. But now we're up into double layers of trust. The reader has to trust that I have a trustworthy source. Why should they? Readers don't on the whole interact with individual bylines enough to establish a relationship of mutual trust. So Scott's original heuristic is about right.
But it does lead to a genuinely damaging situation in which (to speak from experience) a Guardian executive will say "We can't use that quote because the Mail would love it". And, presumably, vice versa.
Some lines the media already crosses are pretty far out there. No idea about FOX, but I do know that France 24 or TV5 Monde can quote a foreign figure saying X and "translate" that as saying the opposite of X. Or show footage of NATO tanks and imply those are from a non-NATO country. Source: a relative who watches France 24 and TV5 Monde and knows both languages.
This would be a lot more convincing as an argument if the media didn't frequently make up events that never happened or lie about them in hugely significant ways. You've covered many such events in the past on your blog as well, which makes me wonder if those all somehow fall into lies no smart person was expected to believe, even if those lies launched wars where millions of people died.
Recently Rolling Stone Magazine made up a story about ivermectin poisoning cases causing gunshot victims to be unable to get into hospitals. This was picked up and repeated widely in the left media despite having no basis in reality.
Not to mention the clear lie about horse dewormer, which is utter nonsense concerning a drug on the WHO essential medicines list, whose discoverer won a Nobel prize for it, and which is an approved drug for humans in every single developed country in the world… Are you telling me they didn't know that they were lying as they cashed those Pfizer advertising checks and listened to their board of directors, some of whom also sit on Pfizer's board? Is this outright and intentional lie equivalent to making up false footage of a mass shooting? Is this not the sort of totally made up nonsense reporting you're talking about them not doing? Because they are doing it anytime they feel like it.
And what of the evidence free Russiagate story while they ignore what google and Facebook have done to interfere in the election?
What about something simple like how Rodney King’s name isn’t Rodney King? They couldn’t get his name right in the reporting and stick with their error.
These are higher profile cases and lend themselves toward controversy, but even with various other stories reality gets twisted to such a degree that it may as well be made up. Anyone I've met who has been part of a news story has said that the reporting was a lie and a misrepresentation of what went on.
Science reporting is a favourite point of malfeasance, making up nonsense to pretend a study says the opposite of what it actually says. Often with the scientist telling the reporter over and over again that they're wrong. There is no world in which those science journalists don't know the truth and limits of the study when they've spoken to the corresponding author, yet they'll just make up lies as they see fit, and they do it on purpose.
Is this 100% exactly the same thing as making up a false citation to then say whatever they wanted to say? No, it isn't, but I fail to see a difference. If you can say whatever you want, then the base reality event is just random cannon fodder for the lie machine, even if you can squint at the gruesome chunks of flesh and occasionally make a guess at reality. If you want to say up is down, it isn't hard to find some loosely related 'up type' event in reality you can twist.
Your own experience with the NYT is proof enough of a low stakes case where pigheaded reporters and editors simply wish for whatever reality they want and do whatever they want with information.
Does it matter if half the media lie and the others do not? Is it somehow better if both Fox and MSNBC go along with Bush and Powell about nonsense yellowcake claims and connections to 9/11?
Was a reporter not caught on a hot mic talking about how she had the Epstein story years ahead of time and had to suppress it because of management who didn't want to get cut out of the royal baby and wedding coverage? Or how they all lie about his 'suicide' that was clearly not a suicide? We just nod along and go yes, yes… we smart folk know he was obviously some intelligence agent taken out by his handlers.
In any given year many false, made up, and non-factual stories run, and there'll be a mix of them doing these things on purpose, simply picking up propaganda and running with it, and the mightiest tool of censorship being non-coverage of stories to devalue them. Along with early reports, rushing, and just plain being wrong.
But they definitely make up things out of thin air when base reality fails to provide them an excuse they can use to say something else entirely.
This is an interesting blog post. It also has theoretical potential, when it comes to elaborate and fine-tune a theory of human interaction as such.
You are essentially at the intersection between signalling theory and semiotics. Which in my humble opinion is “where the action is” in the human sciences today (and probably in the life sciences more generally).
What you are describing is the way principals (defined as actors in a coarser information position, in this case: those who read and watch the media) try to screen messages and signals from agents (defined as actors in a finer information position, in this case: journalists and editors belonging to different news outlets) to detect which agents are trustworthy/who to trust.
... Journalists & editors send messages and signals in order to come across as trustworthy. Users of media (the rest of us) try to screen these signals and messages in order to determine who we can trust, and who not to; including when we can trust messages sent by those we normally do not trust, and when to be sceptical toward messages by those we normally trust.
One of the (interesting) points in your blog post is that some principals are better at screening such messages and signals than others, including that those who are less good may (rationally) adopt cruder strategies in lieu of fine-tuned screening abilities, such as “trust nothing from news source X”, or “trust nothing except from your close circle of friends and relatives”. And then you try to suggest some kind of demarcation criterion to use, to improve your screening skills. Again, theoretically interesting – and of applied interest as well!
…you might also have the embryo here, of a strategy of how one might potentially establish what a commenter to a previous blog post labelled “the inner party”; i.e. how to solve the very difficult problem of creating a circle of people who are able to subtly signal what to believe and what not to believe to each other, while at the same time being able to collectively maintain signalling something different to the “masses” (the great unwashed). Re: your story about good versus glorious harvests in good-old USSR-time Pravda. Hmmm…
…Some classic essays and articles come to mind here: “Trust in signs” by Bacharach and Gambetta; “Trust as a commodity” by Dasgupta; “Strategic interaction” by Goffman.
Elaborating these types of insights into a fine-tuned general theory of signalling & semiotics, I suggest the following one-liner as the overarching premise for that general theory: “We are all principals when observing others, and we are all agents in the eyes of others”. Meaning that “we are all in a coarser information position when observing others, and we are all in a finer information position when being observed by others….”
Good in general but totally fucked about the 2020 election because the WaPo and similar sources were not in a position to KNOW whether there had been well-covered-up fraud AND THEY DIDN’T WANT TO KNOW.
No need to get into the details here to try to persuade you about fraud in the 2020 election, just telling you that it is a really bad illustrative example. Furthermore this IS the kind of thing they would lie about for the same reasons they spiked the Hunter Biden stories they knew were probably true.
This is my explanation for why Zeynep Tufekci, a sociologist who studied the role of social media in real-world phenomena, turned into one of the most prescient COVID pundits from the very start: she has no microbiology background, but she has a finely-tuned sense of who is playing politics with the truth, and which ideas are being brushed aside for reasons besides validity.
News media outlets have a lot of discretion over what is news and what is not news. Obviously, wars, stock market crashes, blizzards, etc. are going to make the newspaper. But in a country of 330,000,000 people there is always more potential news to report upon than there is space for it, so judgments must be made.
For example, the New York Times, which traditionally strongly influences the rest of the news media, finds the rather dusty story of Emmett Till, a black youth who was murdered in 1955 by whites, to be worthy of constant coverage. The name "Emmett Till" was mentioned in the NYT in 57 different articles in the last 52 weeks, and in 407 articles since 2013.
The once-a-week invocation of Emmett Till serves the Times' purpose of encouraging readers to believe the Narrative that blacks are in grave danger of being murdered by whites. Granted, somebody with good critical thinking skills might notice that if you have to keep bringing up a 67-year-old incident to serve as an example of your statistical hypothesis, you might not actually have a strong case. But most New York Times readers are more in tune with the mood music than with the data.
In contrast, the New York Times does not much at all like to report on black-on-white violence, treating it as distasteful police blotter items of only local interest. Not surprisingly, readers of the national news thus tend to get a highly lopsided and biased view of the criminal justice system, with disastrous consequences, such as the historic increases in murders and traffic deaths since the declaration of the racial reckoning two years ago.
My mental model of “lying” is the distance between what someone is saying and what they consciously attribute truth to. The longer that distance, the more lieness they have. If they just refuse to come to a conclusion, to cheat and jam the lieness calculator, I give them an automatic 30% lieness score with an “undecided” annotation.
So: “Really savvy people go through life rarely ever hearing the government or establishment lie to them. Yes, sometimes false words come out of their mouths. But as Dan Quayle put it:
Our party has been accused of fooling the public by calling tax increases 'revenue enhancement'. Not so. No one was fooled.”
I like this, but there’s a missing piece. Substitute “fooled” with “betrayed.” Quayle says no one was betrayed because everyone understood the code. But collectively there was a betrayal of the information transfer process, by use of obfuscation. Obfuscation always takes a tiny bit of extra effort, because it has to tilt in the preferred direction. So there was effort made to present something not congruent with Quayle’s personal attribution. That’s a higher lieness score. Zero-consequence obfuscation is not a thing; if it wasn’t accomplishing something they wouldn’t do it. Maybe it was finessing attention away from the topic, making it slide by unnoticed, so whoever bothered to think about it would crack the code and not be betrayed, but more people would simply not notice?
So “really savvy people” are not experiencing constant betrayal, because they both pay attention and know the code. They may be able to change with the conditions and not sustain harm. But the lieness score for Quayle is still nonzero.
Unwillingness to score someone on lieness is not necessarily gullibility, but it is unwillingness; if I’m not betrayed either way, surely I can look at the nonzero lieness score?
“Clueless” may struggle to distinguish the code from the lieness. It may cobble together into a “likelihood of betrayal” score.
In fact 70% of people thought Saddam was directly responsible for 9/11, not just that he had WMD. That he had WMD was not a fabrication of the media but of the political establishment – that, or they actually believed it. The media was reporting what the administration said. I don’t remember there being much opposition to the idea that Iraq probably had chemical weapons, outside left-wing anti-war journalism.
Calls for invading Iraq started pretty soon after 9/11, and the media conflated Iraq and the event pretty soon after. Bush apparently blamed Iraq within 3 days, and the media implied or said outright that Iraq was responsible directly, or that Iraq helped Al Qaeda. 82% believed the latter. This couldn’t have just been the right-wing media; its reach is not far enough.
Why would the government saying that the harvest will be good instead of glorious mean the harvest will be bad?
In my experience it’s clueless people who end up being the gullible ones when, in the throes of their paranoia (fear), they fall for a conspiracy theory or magical religious thinking. They want to believe it. Those types are perfect marks for grifters/scammers.
For the Lincoln example, you can argue that the journalists *know* most people don't read past the headline. So the speculative piece was an excuse to push the myth that Abraham Lincoln was into Marx in the headline. But I accept the wider point.
For the science case: if you take something like the causes of autism, the public have a great interest in it but are led to believe it's just some random great mystery. The actual science is now in a position to make some definitive statements about likelihood. But none of this is propagated to the public lest it make unsavoury geneticists look correct.
This is a great post. It would be even better if it explicitly acknowledged that the rules change, and that we are living in a time in which that change is rapid.
Rapid change should, and often does, undermine people's confidence in their ability to discern what is true from what is reported.
I'd highlight one particular change as having been explicitly planned and having backfired spectacularly:
Before Trump, most quality media organizations were committed to reporting on events neutrally. They always presented both sides of the argument, and avoided drawing conclusions.
The argument was then made that if one side is lying through their teeth and the other is telling the truth, this approach may serve to mislead more than inform. This sounded eminently reasonable to me in theory, and it came to pass.
Unfortunately, it has not worked out very well in practice. Being freed from presenting the other side's arguments has led to a great deal of disinformation and severely compromised my default trust level in articles appearing in the New York Times and Washington Post.
I suppose this is better than most such changes, in the sense that it was at least explicitly discussed and thought about.
Or maybe not. Maybe this illustrates how little value explicit discussion actually has, since our collective wisdom is insufficient to avoid serious harms.
That said, I think it's fair to say "this is a game I'm not interested in playing". That's my stance. I feel confident being able to tell the difference between the cases more often than not, but since most of the news is not interesting to me anyway, I just don't expose myself to it. I don't need to constantly worry about getting it wrong in the edge cases and waste brain cycles on that.
Given the rare scenario someone wants my opinion on something from the news, I can offer my abridged first impression thoughts based on their summary with disclaimers, or I can dig in then. This has been working well for me, but whether it does is necessarily dependent on one's social circle. (There are some where even the disclaimer "I don't actually know anything about this yet, but from what you've told me," might prompt outrage.)
But I think a lot of people who distrust the news distrust it for the stories it *doesn't* tell. For example, my instinct on reading this article's first summary on the Lincoln/Marx topic was "and how many friends did Lincoln have? Is there evidence he favoured Marx's views any more than some others?".
Similarly, when the news tells me, for example, about some bad thing [big corporation] did, but doesn't let them speak up, I wonder if the corporation has an actual reasonable justification for their actions that's being swept under the rug (sometimes they do, sometimes they don't). Same with political parties, nation states, et cetera.
And I suppose sometimes they also do just screw up and "lie", but I honestly get the impression that's just because humans are involved and humans sometimes make mistakes - it's typically not an attempt at fabricating facts. (Granted, that might be an observation true for the Tagesthemen in Germany, on which I'm basing most of my opinions, who seem to at least *want* to take journalism seriously. Fact-checking can be hard, even for big players, but it's a very, very rare event that they screw it up completely.)
See also Scott Lawrence's comment, which gets into that failure mode.
Grammar nitpick: “then one extra mass shooting” -> “than”
Didn't Scott already write this essay? I think it might have been part of a much longer essay on another subject, and in the other version it suggested that middle class people, being one step closer to, and hence having a better mental model of, the sort of people who actually have power, are better at sorting the lies from the not-quite-lies.
To pick on the examples, though, I think you have far more faith than I do in the Washington Post's reporting on election fraud. I'm not saying that there necessarily _was_ massive fraud, but I can't see any mechanism by which the WaPo would be inclined to look into whether there was; as an organisation, the Washington Post had a fundamental incuriosity about any story that might help Trump (what _was_ the deal with those Hunter Biden emails anyway?), so they have no more interest in finding out whether there was electoral fraud than the Swedish government has in finding out whether immigrants commit more rapes.
The linked article is a perfect example of why I can't trust the WaPo's reporting on this subject, it's incredibly disingenuous. As slam-dunk proof of the paucity of fraud, it offers the fact that only a small number of double votes were found... but double voting is the dumbest and most blatant form of electoral fraud there is; I'd like to know how many mail-in ballots were stolen, either before or after delivery, and how many ballots were "harvested" in suspicious circumstances.
This article is the equivalent of "Kangaroos don't exist, I checked my back yard and my front yard and didn't find any".
Or of course you could just learn something about the underlying measureable facts of the situation, and be able to judge when the experts (or politicians) are shading and when they're being straightforward -- on a sound *empirical* basis, and not via either the amateur social psychology you hopefully picked up in your mother's milk and/or School o' Hard Knocks, or via a Jesuitical parsing of the exact linguistics.
I mean, this is what we do elsewhere. If I want to know which financial pundits are lying through their teeth, the best advice is to dig in and learn something about finance, stocks, options, et cetera, master the vocabulary and math, and start paying attention to ticker symbols. Basing some critical judgment on the social psychology of journalists, or an elegant reading between the lines of their prose, is a very poor second best.
Spoiler alert: all of them.
FWIW, Samnytt.se (the news outlet referred to in the "immigrants' crime in Sweden" part) is basically a Swedish version of Breitbart News. Radically right-wing, racist and with a VERY lax view on journalistic integrity and – which the entire blog post is about – the truth.
Ten years ago, during the whole Muslim psychosis, I do remember this sort of whole-cloth lying a lot, though.
Does anybody else remember all those stories about "no-go areas" in Europe? All the while there were Europeans *living in those very areas* on the internet yelling that this was crazy?
Excellently written. Your gift is appreciated. Reminds me of my youth when George Bush Sr. laid out his doctrine on a New World Order. It was like God had finally spoken…then my grandfather educated me about the use of New World Order in history. Damn.
A quick note on the recent, raging "Expert Failure" debate.
It appears to me that the experts have suffered a corruption of the systems they are a part of. Without getting into the weeds on what corruption means in this context, let's say that the reputational risk:reward on honest communication has become such that honesty is heavily disincentivized. Some of us have heard countless examples of "behind closed doors, my expert friends say so and so, but they wouldn't dare say it publicly" in the last couple of years.
So I suggest a solution to this: how about anonymized expert networks? This way, we get to hear from the experts, without any risks to the experts. Kind of like the semi-dark expert networks that private equity shops heavily lean on.
Similar to Metaculus, but with (an apolitical, test-based) screening for expertise and a focus on deep insights versus predictions. Would be nice if Bill Gates or some billionaire would set it up and provide compensation to the experts.
In our desperate search for truth in a post-truth world, filtering for expertise and adding anonymity may get us closer.
I wonder if Web 30.0 will have a solution for this
What makes it even harder to "know the game", is that it is not just one game, but that every scientific community develops their own rules. Climate scientists follow a pretty different set of rules than neuroscientists. If you are savvy enough to read papers from climate scientists, that does not make you savvy enough to read papers from neuroscientists.
And of course, all the same for journalism. Tabloids follow different rules than broadsheets. The science part of a newspaper follows other rules than the politics part or the sports part.
Being savvy includes knowing which articles and statements you can interpret right, and which ones you can't.
This is one of those posts I can tell are true and important because it leaves me totally uncomfortable and unsatisfied.
Those interested in a rigorous look at immigration as it relates to sexual criminality in Europe should read Ayaan Hirsi Ali's latest book, Prey: Immigration, Islam, and the Erosion of Women’s Rights.
I agree with Scott that it is a really important skill to "bound your (dis)trust" when interpreting public statements (or your friend Tina when she says the food at this new restaurant is great and you should go there).
What I disagree with is the sense I get from the article that this is a binary skill (you get it or you don't). I think this is a very hard task, everybody struggles with it to some degree and it is often impossible to figure out what's the right amount to trust (or what the exact bias is). Your own priors will also determine how much you should trust someone or what to take away from the statement. Really, it's just a special application of Bayesian updating, and we know how easy that is in practice.
Case in point, I think "the WP says the election was fair" should be compared to "Saddam has WMDs" rather than "mass shooting in NY". Why? Because these are the two cases where the media coverage, to a first approximation, can be explained by the fact that it repeats the official governmental position on issues where the media would have a much harder time if they wanted to endorse a different position (much like the problems the Swedish immigration crime rate study experienced). So if somebody is convinced that the election was rigged despite all official bodies saying it wasn't, the WP article isn't going to change their mind based on bounded distrust.
I would take a different lesson from the examples you provide:
1. School shooting – All news portals would say that the same person killed the same number of people at the same school. There are very few variables, like the number of victims, name of the killer, etc., and the values of these variables are not open to interpretation. You cannot say that Abdullah looked like a John.
2. Election malfeasance – The only direct variable involved is "Election fair=True/False". This variable is impossible to measure directly. Hence, either party is free to choose other variables that are indicative of the value of the direct variable. For example, Fox News might choose "trends in past elections in swing states" and say that time-honored trends were not followed in 2020, indicating that the election was not fair. Washington Post might contradict this analysis, and so on. It is only when direct variables cannot be measured, and we have to study indirect variables that we are free to choose in the manner of p-hacking, that news becomes open to interpretation.
I take your point about experts not being willing to sign a petition with a false claim. But this is manifestly untrue when the issues involved are political. Middle school education experts sign petitions with the claim that more funding poured into middle school education drastically improves the education outcomes of students independent of IQ. Also, Bill Clinton, the expert at having an affair with Monica Lewinsky, lied about his affair with Monica Lewinsky. Self-interest can muddy the waters significantly for experts even when talking about their own fields.
This is great, and I plan to share this with friends.
I would be very happy if I felt that most blue tribe people would agree with this. But my experience is that no, they won’t. They get angry and mad if I say that the New York Times isn’t really a reliable source because of its biases.
This is partly a test. This post and the more recent one about poverty and EEGs won't load on Chrome. At first, they would load and then quickly switch to "too many requests". Now this one will load from Opera but not Chrome.
As for the topic, this isn't just about news source, it's also about cancel culture, both right and left, which are based on deciding that some source is completely disposable.
"They don't talk about the "strong scientific consensus against immigrant criminality". They occasionally try to punish people who bring this up, but they won't call them "science deniers"."
The first statement is true, the second statement seems ~ false. When people bring up similar results they are often accused of "peddling pseudo-science", which seems functionally analogous to "science deniers." If someone asserts 'immigrants commit a disproportionately large amount of crime' they wouldn't be called "science deniers" only because it doesn't generally make sense to call someone a denier for asserting a positive claim.
I think the issue I have reading this is the impression I get that you, writing it, and everybody reading it, is going to think "Oh, I'm smart enough to have the correct level of distrust". It smells like just-world theory, only about intelligence instead of morality.
"While I've heard rare stories of the media jumping in too early to identify a suspect, "the police have apprehended" seems like a pretty objective statement."
The irony is that this literally happened last week with Malik Faisal Akram in Texas.
I understand the point being made, but I think part of our current problem (with vaccines) has to do with how much trust the pharmaceutical industry has burned in the past 25-30 years. There are some few people who won't get the shot as a political marker, but there are others that look back to the opioid epidemic and the hand-in-glove relationship with our regulators and the hair raises on the back of their neck.
This article is probably the main reason I read Scott. I identified that he has this skill much better than I do, and would never intentionally state something he knows to be certainly false. If there is a chance of it being true or false, he uses qualifying words or even percentage guesses. And he rates his guesses at least annually, to calibrate himself. Currently, I cannot name any other source that is both better at this skill than Scott, and this honest. If anyone else has suggestions, that would be interesting though.
You made a similar point in another post: “it’s not that bad if experts get things about Covid right two weeks later than MTG players”. Yes, they are like 2 weeks or 2 years (as with N95s) late on Covid. But on other topics they are decades late and not catching up. For example, beliefs about education and signalling. You compared schools to child prisons, a position comparable in its anti-expertise to denial of anthropogenic influence on global warming.
Two things: (1) Media bias isn't that hard to correct for, for people who have an interest in doing so. But the supply of biased stories is created by a demand for bias. Most people have no desire to correct for bias. The reading skills and thinking skills you discuss here (as with the Lincoln-Marx example and the government harvest prediction) are not nearly as hard as you make out for people motivated to suss out bias. (2) I'm usually more concerned about ignorance in news stories, which may or may not be filtered through bias in the experts chosen to ornament a story. My standard method is to pick stories in some given outlet which are about something in which you know more than the journalists. They almost always get it wrong. That usually isn't bias (although they may be biased as well.) They just aren't trained in whatever the subject is. You can use the difference between what you already know and what they report as Bayesian prior evidence for future stories in which you have no specific domain knowledge.
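The Bayesian-updating idea in (2) can be made concrete with a toy sketch. All function names and numbers here are illustrative assumptions I'm adding, not anything from the comment: treat an outlet's track record on stories you could independently check as evidence about its reliability, then use that estimated reliability to discount a fresh claim.

```python
# Toy sketch of "track record as Bayesian prior evidence".
# All names and numbers are illustrative assumptions, not real data.

def posterior_reliability(correct, wrong, prior_a=1, prior_b=1):
    """Beta-Bernoulli posterior mean for how often the outlet
    gets independently checkable facts right."""
    return (prior_a + correct) / (prior_a + prior_b + correct + wrong)

def credence_in_claim(base_rate, reliability):
    """P(claim true | outlet asserts it), assuming the outlet asserts
    true claims with probability `reliability` and false ones with
    probability 1 - reliability."""
    true_and_reported = base_rate * reliability
    false_and_reported = (1 - base_rate) * (1 - reliability)
    return true_and_reported / (true_and_reported + false_and_reported)

# An outlet you've fact-checked 9 times: right 6, wrong 3.
r = posterior_reliability(correct=6, wrong=3)
print(round(r, 3))                        # prints 0.636
print(round(credence_in_claim(0.2, r), 3))  # prints 0.304
```

With a prior credence of 0.2 in some claim, a 6-right/3-wrong outlet asserting it only moves you to about 0.30, which matches the commenter's point: a spotty record on things you can verify should translate into weak evidence on things you can't.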
Great piece. The screening ability may be somewhat (ha ha) rarer than Scott suggests. Wonder if changes in education over the past 20 yrs have affected the prevalence of the skill. Did no child left behind reduce or enhance adults’ truth detecting intuition?
The AIER article on Lincoln & Marx was written by Philip Magness, who regularly mocks anyone who takes Chinese data on COVID deaths/cases seriously. In contrast, Greg Cochran (of "creepy oracular powers" fame) realized how serious COVID was based on the Chinese government's reaction, and regularly mocks COVID skeptics (whom he's also won multiple bets against) for thinking that large numbers of dead bodies are the kind of thing that could be easily disguised.
Deleted my previous posts after I thought about this a bit more. The belief that "media literacy" is a skill seems to rest on a flimsy assumption: That biased journalists/experts are writing in a secret code where 99% of people will read it and believe a lie, but the truly smart people will decode the article correctly and find the truth. There's no reason to believe this always holds, even if it holds sometimes. A journalist with a single stroke of a pen could change the article so that there is no way to get at the truth. An organization could watch people successfully "decoding" the articles on Twitter, and adjust their writing style so that using the same decoder "key" will uncover another lie. It does not seem worth it to try to engage people who are writing in bad faith in such a way out of a belief that somewhere in there is a tiny speck of good faith.
Concisely stated, the rule would be to always believe extremely objective statements. As a corollary, if somebody isn't making objective statements (i.e., is using weasel words), just ignore them. Good heuristic if you can use it (sometimes weasel words are just too common).
While I generally agree with that, it means scientific frauds can go for a while without being discovered.
People close to him knew he was a fraud (source: somebody who was close to him told me) but Bell Labs management didn't want to believe that and we typically accept the raw data from scientists as true; peer review is about checking methods but assumes good faith. Schon was caught because he had reused figures in ways that couldn't possibly be anything but fraud. If he had been less lazy he might not have been caught. Makes one wonder how many uncaught frauds are out there.
It would be easy to write this off as an outlier, except outliers can have extreme effects.
The analogy to news might be using photos at the top of an article which are true but wildly misleading, something which happens all the time. Stock photos to set the scene are harmless, but often people cross that line too.
I think this is a true and important post.
One extension I would like to add is this also applies to other matters like trusting governments (ie China). I was surprised to learn over the last few years a lot of people lack the skill to appreciate the scope and types of things the Chinese government can lie about.
I think Scott’s core point is trivially true—there is some core of objective fact to most (not all) media accounts that very few journalists or experts would actually lie about—but the larger post seems like a bit of a motte-and-bailey argument. Those of us who are radical media skeptics don’t take issue with that proposition in theory, we just think that category is much smaller and far less significant than the post implies, and that what gets reported in the first instance and what does not is carefully curated, and that selective reporting and omission of important context eliminates any usefulness of the media in all but the lowest common denominator sense (i.e. I believe that if the media reports a demonstration in my part of town, I can reliably predict increased traffic and logistical difficulty in that area). Scott’s post suggests it’s possible for the sophisticated to extract more useful signal than that; I think that’s cope by those who don’t want to acknowledge how bad the situation is or are concerned about the consequences of enough people thinking like that. But that’s, like, just my opinion, man.
The discussion of particulars in the comments here seems to kind of miss the point a little. The real issue is how much you should update your views based on even true information provided by those with the ability, incentive, and stated intention to selectively present such information to you. I think the answer is “not very much” and that goes to zero if a particular event is already in your “this happens sometimes” category.
There's a set of folks that I'll just call "epistemic institutionalists" [I've heard the term intellectual authoritarian used but the 'A' word has negative connotations]
i.e. the idea that the responsible thing is to simply teach someone to be capable of identifying the institution that promulgates a fact/set of facts/narrative and either trust or dismiss what is said without actually sifting through the contents for value.
Now, the kind of skill that Scott is describing is basically that of an extremely high reading level which is at times combined with varying degrees of statistical literacy. There's also implicitly a certain temperament (emotional detachment) required but let's waive that for a moment.
The idea that, under even the most ideal circumstances, a huge portion of the population is not going to attain a particularly advanced reading level and/or mathematical aptitude (perhaps because of some underlying physiological trait that can't be significantly enhanced through environmental or medical stimulus) is *extremely* taboo with the aforementioned institutionalists.
If you're an institutionalist you more or less have custody of the youth's instruction from the ages of 7 to 18 and beyond. If you decide what you'll do with that time is to drill your students in trusting and dismissing sources out of hand you're more or less operating on the assumption that the vast majority will never attain the level of skill needed to do what Scott describes.
Perhaps experts don't lie about immigrants committing more crimes than natives, but I think they come pretty close. E.g. in https://www.svt.se/nyheter/inrikes/kriminologen-jerzy-sarnecki-las-in-unga-valdsbrottslingar-lange Sweden's most prominent (in mainstream media) criminologist says that immigration has not increased the amount of violent crime in Sweden. It is very difficult for me to believe that this expert truly believes that in the absence of large-scale migration of exotic peoples, Sweden's gang-rape statistics would have looked the same.
This is, more or less, how I learned about covid in November 2019. I specifically follow right wing news aggregators because I see things I don’t see in left wing/mainstream news aggregators. Enough of it checked out and the way they presented it wasn’t the way I would have expected them to present it if it was entirely fiction.
> There are lines they'll cross, and other lines they won't cross.
Good post, but I think the real issue is that they're crossing that line because it confers some advantage to them, and that's likely because they *know* some people will misinterpret it. That's clear deception no matter how you slice it, and therefore we have to ask ourselves whether we should tolerate line-crossing at all.
For anyone looking for more examples of this reading-between-the-lines process, it is sometimes referred to metaphorically as Kremlinology.
I don't know if links will get blocked, but for anyone else looking for a link to the study (rather than a news source's review of the study) it is here: https://www-tandfonline-com.translate.goog/doi/full/10.1080/20961790.2020.1868681?_x_tr_sl=sv&_x_tr_tl=en&_x_tr_hl=en-GB&_x_tr_pto=sc
If I were to try and summarize it I'd say: they used statistical tools that don't start with a hypothesis, so neither immigration nor anything else was a "particular focus" (although presumably they assumed that some of the data they'd fed into the model would turn out to be relevant). However, after the statistical tools were run, in their own abstract they say that a "key point" is "The majority of those convicted of rape are immigrants."
> What’s the flipped version of this scenario for the other political tribe? Here’s a Washington Post article saying that Abraham Lincoln was friends with Karl Marx and admired his socialist theories.
A better comp to the Fox scenario you describe would be just over a week ago, when several liberal outlets described the Texas synagogue hostage taker as a "British man" and then covered a press conference where some fed said something to the effect of "doesn't appear this had anything to do with Jewish people." Ok then!
> The 2020 election got massive scrutiny from every major institution.
The 2020 election was highly irregular and far from receiving scrutiny, every major institution continues to refer to it as the freest and fairest election in our nation's history. Including evil Fox News. Doesn't mean it was rigged but it received anything but "massive scrutiny."
> They occasionally try to punish people who bring this up, but they won't call them "science deniers".
Too good. "You may violate our women, suppress dissent, and make a mockery of any concept of democratic governance. But don't you dare call us science deniers." How to control a rationalist with this one easy trick!
> The reason why there’s no giant petition signed by every respectable criminologist and criminological organization saying Swedish immigrants don’t commit more violent crime than natives is because experts aren’t quite biased enough to sign a transparently false statement.
Why bother signing a transparently false statement when the government can just bring "misconduct" charges against them (only when results lead in a certain direction, as you point out)? As for the climate change example: a global phenomenon with global funding sources. No one government can silence dissenting voices, so it is left to deep pockets to offer the carrot. Not saying the climatologist letter is wrong or a lie, but it's a poor comparable for the Swedish immigration study.
This was a lot of words to sorta, kinda admit you were wrong about ivermectin. And all the reading of tea leaves and "sensing of the dynamics" does not account for an obviously coordinated effort to suppress and discredit a generic drug that at the very worst does no harm. From a bizarre media blitz of calling it "horse dewormer" to doctors losing their medical license for prescribing it and pharmacies refusing to fill it. This is unexplainable behavior, and so, they don't explain it. The harvest was meh this year, comrade. Pay no attention to those Golden Arches. Only horses eat there.
Speaking of bias, you can misrepresent a situation while doing nothing but presenting the truth. Imagine, for example, that FOX News took it on to report *every* case of a major crime committed by an illegal immigrant. They could have teams of investigators on every case, outclassing the police, reporting nothing that wasn't sourced to objective evidence or at least 3 separate, credible eye-witnesses. And it would still be a misrepresentation because it was placing undue emphasis on one group of people.
As always, thank you for your thoughtful consideration of important and complicated issues. I have a quibble:
"I think some people are able to figure out these rules and feel comfortable with them, and other people can’t and end up as conspiracy theorists."
That's a false binary; you present it as 'well-adjusted people who understand nuance vs q-anon nutters' when there are at least three categories. I'd argue that the people who blithely believe Fauci is a hero who Embodies Science and buy t-shirts and devotional candles with his likeness do MORE damage than people who think Fauci engineered covid so that Bill Gates could distribute his 5G chips. On the other side of the spectrum from Alex Jones are people who ALSO aren't able to 'figure out these rules and feel comfortable with them' ... rather than misinterpret the meaning of the game, they *fail to see the game at all*, and then fall victim to the same tribalist comfort of unthinkingly belonging to a team.
The middle, rational position isn't to 'feel comfortable' with the game - that's way too much like not seeing it at all - but to simply realize that everyone, to some degree or other, is lying to you, *all the time*, and to act accordingly.
I think this piece overstates the importance and implications of "FOX wouldn't make up the fact that there was an act of Islamic terrorism". Let's say that's true. But still, if they choose to report on 30 true acts of Islamic terrorism and choose not to report on 300 true acts of non-Islamic terrorism, what meaningful knowledge do you gain from correctly understanding that they wouldn't make up the 30 that they did report on? Same thing on the left, if CNN reports that there were 24 new laws that limit people's ability to vote, and they wouldn't and didn't make that up, but they also don't mention that there were 50 new laws that enhance people's ability to vote, what important thing have you learned from correctly understanding that they wouldn't outright lie about the 24?
There's another possibility: you think you understand the rules of the game but you don't really. I think this could well put you in the worst position of all. To be specific, I find it very hard to square the early statements on the possibility of a lab leak with the rule that scientists won't "flatly assert a clear specific fact which isn’t true".
The AIER rebuttal has its share of seemingly calculated half-truths and insinuations. For example, "But Marx’s articles for [the New York Tribune] consisted of brief news summaries about the Crimean War, continental European politics, and piles of dry filler material about annual crop yields and industry reports. Only a small minority of these works ventured into something resembling a cohesive Marxian economic theory"—that last part is unsurprisingly true, since there wasn't yet such a thing—but his columns seem to me characteristically passionate and ideological. The author even generously links to those articles. Did he not expect anyone to check?
I feel there are two orthogonal layers here.
A) Status signalling
B) Deceiving v truth
Generally deceit rather than lying, because they technically don't lie, but they still have the mens rea of giving people a false impression.
Most comments seem to focus on the deception part, but I don't feel it is the main driver here. It mainly seems to be a complex game where high status try and trip up low status people, so they can laugh at them as conspiracy theorists or people who don't understand the elite terminology.
The B) layer is quite blatant once you penetrate it: the Swedish government and the Marxists are actually trying to deceive for simple political purposes, and that is just normal politics.
IMO Section II should have referenced Russian Collusion. This was incredibly corrosive to the nation's political discourse, was relentlessly pushed by WaPo for years, and was false.
I had a similar thought the other day, reading a tweet from Jesse Singal that he had "absolutely no fucking clue who to believe about anything Omicron-related" due to the public health officials having "beclowned" themselves.
A binary "to believe or not to believe" is a naive question. Instead, the question is "what information can I extract from this?" Public health officials tend to be paternalistic consequentialists (e.g. saying what they think we need to hear rather than what is most truthful), get roasted more for being wrong in one direction than the other, and, like other humans, have an inflated sense of their own importance, virtue and correctness.
Through that lens, I understand (more or less) why public health officials have exaggerated certain risks (e.g. outdoor transmission), overstated the benefits of certain interventions (e.g. masks), flatly denied that there is evidence of efficacy for interventions with conflicting, generally positive but low-clinical-significance effect estimates (e.g. ivermectin), and have been painfully slow to update their guidance in the face of new evidence.
If a public health official says "yes some evidence suggests ivermectin has some efficacy, but if you're so damn interested in an efficacious intervention go get the god damned vaccine", some people will only hear "ivermectin is effective." Based on that statement, perhaps 1000 people who might have gotten the vaccine won't, thinking that ivermectin will ensure their health if they get Covid. More saliently, they open themselves up to the criticism of their paternalistic public health peers. So of course they don't want to say that. Is that irritating? Yes. Is it the right move? Not sure... it's hard/impossible to predict the consequences of people hearing "ivermectin is effective" vs the consequences of "yet another misleading statement from public health" if the evidence of efficacy is denied.
That isn't to get them off the hook. I dislike the paternalism of healthcare generally, as it negatively affects me personally and I suspect is an overall "inadequate equilibria" (which is to say, there is a better way). I also get why people distrust public health officials. But what I don't get is Jesse (a smart and perceptive dude) having "no fucking clue" who or what to believe. Or rather, I don't think his problem is actually epistemic - he *does* have a fucking clue. He's really just stating his objection to misleading statements. I am sympathetic, though ideally he wouldn't be broadcasting "we can't know what to believe!" when, IMO, we have enough information to triangulate on probable truths.
Regarding "the fake news that falsely claimed that Saddam Hussein was undertaking a major weapons of mass destruction program": then what exactly did the Israeli Air Force destroy in 1981, in the strike on the Osirak reactor, literally days before it was about to go critical (i.e., before the nuclear reactor cores fired up, after which the fallout released from their destruction would have been potentially devastating to the Iraqi civilian population)? Yet another totally innocent aspirin + baby food factory?
> I’m not blaming the second type of person. Figuring-out-the-rules-of-the-game is a hard skill, not everybody has it. If you don’t have it, then universal distrust might be a safer strategy than universal credulity.
This is where I have a failure of empathy. Figuring-out-the-rules-of-the-game is an obviously important skill, and growing up I had it as part of the school curriculum three times before I was twelve. Sure, there's a long list of critical details about how journalistic sausage is made (ex: headlines are written by other people, Opinion articles exist in a different universe from fact-checkers, "editor" as a job title is meaningless, etc.), but at a base level I don't understand how someone lasts a decade on the internet without learning to parse articles for *what is actually being claimed*. Not the selling point, not the impressions, not the feeling it tries to leave you with, but the factual information it attempts to convey. (Or just as importantly - the lack of any such.)
I'm not up to writing it all out right now, but I'll caution against portraying this as a one-dimensional binary: the opposite of the savvy reader is not merely unskilled at parsing truth, but also actively disengaged. The epistemically dangerous territory is where people get burned by not understanding the rules, and instead turn to areas where not even those rules apply. (Yes, I'm talking about social media. Comments sections very much included.)
Scott: "The reason why there’s no giant petition signed by every respectable criminologist and criminological organization saying Swedish immigrants don’t commit more violent crime than natives is because experts aren’t quite biased enough to sign a transparently false statement - even when other elites will push that statement through other means. And that suggests to me that the fact that there is a petition like that signed by climatologists on anthropogenic global warming suggests that this position is actually true. And that you can know that - even without being a climatologist yourself - through something sort of like “trusting experts”."
I think that even if they did commit more crime, experts wouldn't sign it.
I think that some scientists wouldn't attach their name to that petition because it is not politically correct. If your name was on that list, it might not be very good for your career or image. Here is a letter signed by 1,288 health experts about the George Floyd protest gatherings, which says:
"However, as public health advocates, we do not condemn these gatherings as risky for COVID-19 transmission. We support them as vital to the national public health and to the threatened health specifically of Black people in the United States." and "COVID-19 among Black patients is yet another lethal manifestation of white supremacy."
My argument isn't that they are wrong but that this wouldn't go the opposite way. Even granting that large public gatherings were risky for COVID-19 transmission, I don't think you would see many experts putting their names on a letter condemning the protestors. Maybe, though. I think there is political pressure to be publicly open about certain scientific positions and a lot of pressure to be quiet about others. If a petition concerns a topic that is politically damaging to the reputations of the signers, then it should be weighed more heavily.
I am trusting of experts, but the blending of politics and political discrimination within academia creates a filter which leads to consensus positions that lean left. I wouldn't be surprised to see a consensus among economists that Biden is better for the economy, because Democrats outnumber Republicans. And Republican economists might be more hesitant to reveal this.
"Right-leaning academics experience a high level of institutional authoritarianism and peer pressure. In the US, over a third of conservative academics and PhD students have been threatened with disciplinary action for their views while 70% of conservative academics report a hostile departmental climate for their beliefs.
In the social sciences and humanities, over 9 in 10 Trump-supporting academics and 8 in 10 Brexit-supporting academics say they would not feel comfortable expressing their views to a colleague. More than half of North American and British conservative academics admit self-censoring in research and teaching." 
Are you using "clueless" and "savvy" in the sense of the Gervais principle? Maps pretty well.
The response to this is many things. But one of the responses to this is that you assume that the liars do the same quality of lying all the time. It may be that politicians are willing to lie about tax increases by calling them something else, but not willing to lie by completely denying them. But it *also* may be that their willingness to lie varies depending on political whims. Remember "read my lips, no new taxes"? That was a more direct lie than just saying that taxes are revenue enhancements. But that wasn't how Bush typically lied, and indeed when he raised taxes, he actually claimed that he hadn't lied because the new taxes were revenue enhancements. He just switched from an unbounded lie to a bounded lie, while lying about the same thing, and using one of your own examples.
>The reason why there’s no giant petition signed by every respectable criminologist and criminological organization saying Swedish immigrants don’t commit more violent crime than natives is because experts aren’t quite biased enough to sign a transparently false statement... And that suggests to me that the fact that there is a petition like that signed by climatologists on anthropogenic global warming suggests that this position is actually true.
This is only true for trivial reasons: If the media or politicians believe they can get away with telling a lie, it can't, by definition, be transparently false.
But I'm pretty sure even you can think of cases where prominent officials or media came out in favor of something that was false, and known to be false at the time. You can look at a lot of statements about COVID for examples.
This doesn't seem to touch the thing that I assume FOX News does to actually mislead people (I don't watch FOX News, but I'm extrapolating from how the Sun and Daily Mail behave), which is that if there were over the course of the year 10 mass shootings, 9 by people with nice white-sounding names and 1 by Abdullah Abdul, they would not report the first 9 at all, and would fill the airwaves for months with speculation about Abdullah's terrorism.
And "liberal" media TOTALLY does this lying-by-omission thing. There are lots of stories that I think are interesting and important that The Guardian doesn't cover because they reflect badly on "our" side.
[I really can't stand the tribalism of it all, hence the scare quotes, it's easier to use these reductive labels in a discussion but etc etc etc ]
To be fair, the NYT and the WaPo are not exactly Fox News, and they too were parroting "unnamed highly placed sources" who insisted that Iraq was chock-a-block with WMDs, and later parroted other conspiracy theories in support of Empire.
I have similar contempt for them that I have for Julius Streicher and Alfred Rosenberg. And Fox, for the record.
I enjoyed the article, but I'm not quite sure of what the take-aways are, other than: "Don't believe everything you hear. Don't disbelieve everything you hear. Keep working on improving your map." Which is just par.
I tend to lean very hard on the "trusting the establishment" heuristic. Needless to say, it makes me a bit of a pariah in many circles. If we had a regime where more people actually read articles rather than headlines and at least skimmed the Wikipedia article on a subject before commenting on it, discussions on politics would improve exponentially.
On the other hand, I struggle with how much of this is because I might have skills or heuristics that other people just fundamentally don't have. And furthermore, where those heuristics come from. Did I learn them in school and could it be improved though education, or is it just an individual quirk of me and people like me?
> I'm a liberal who doesn't trust FOX News, and sure, I believe it. The level on which FOX News is bad isn't the level where they invent mass shootings that never happened. They wouldn't use deepfakes or staged actors to fake something and then call it "live footage"
How do you know?
I am not asserting that this is true. I am not asserting that Fox news, or any other news outlet, does this.
What I am asserting is: _**IF**_ they did this, you would not be able to know.
So I am curious: What is the basis of your epistemology such that you can say with confidence that Fox News wouldn't make up video coverage of a shooting?
(And, lest you forget, I will remind you that similar falsifications have been documented repeatedly across all news agencies. https://www.nytimes.com/2019/10/14/business/media/turkey-syria-kentucky-gun-range.html is the most recent example I remember. Story: SPOOKY TURKISH TERRORISTS SHOOTING UP EVERYTHING. Video: a gun show in Kentucky. Why are you so certain that Fox would _never_ do this, when ABC did exactly the thing you are describing, three years ago?)
> Even if I learned of one case of them doing something like this once, I would think "wow that's crazy" and still not update to believing they did it all the time.
Well, this is interesting, and it probably highlights a fundamental difference in epistemology between you and me.
I assume everything is always at equilibrium, unless actively disrupted. That means, if I catch the news doing this _one time_, and I can't point at a unique and specific explanation for it, I will assume it's been happening all along and I only just noticed.
This seems so obviously correct to me that I'm curious how you can convince yourself that "oh, it just happened once".
If we want to know the accuracy of an expert, we can ask them to make predictions which can be verified as Tetlock discusses in Expert Political Judgement and Superforecasting. I don't think news outlets would be very happy to hand over predictions from their writers but maybe somebody could sort through and find a prediction in articles and then put it on a website. You could score them to get something very roughly like trustworthiness or reasonableness. There would probably be a lot of variation in authors.
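To make the Tetlock-style scoring concrete, here's a minimal sketch of the Brier score used in forecasting tournaments. All the names and numbers are invented for illustration; a real version would scrape dated, resolvable predictions from articles.

```python
# Hypothetical sketch: scoring a forecaster's track record with the Brier score.
# Data below is invented -- two imaginary pundits and four resolved events.

def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities and 0/1 outcomes.
    0.0 is perfect; 0.25 is what always saying 50% earns; 1.0 is maximally wrong."""
    assert len(forecasts) == len(outcomes)
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Probabilities each pundit assigned to four events, and whether each happened.
confident_and_right = brier_score([0.9, 0.8, 0.3, 0.6], [1, 1, 0, 1])
always_hedging = brier_score([0.5, 0.5, 0.5, 0.5], [1, 1, 0, 1])

print(confident_and_right)  # 0.075 -- lower is better
print(always_hedging)       # 0.25
```

The nice property is that it rewards calibration rather than vibes: a pundit who is confidently wrong scores worse than one who simply admits uncertainty, which is roughly the "trustworthiness" signal the comment is after.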
> Or "FOX is against gun control, so if it was a white gun owner who did this shooting they would want to change the identity so it sounded like a Saudi terrorist".
They do this all the time, +/- a technicality.
Every news outlet aggressively suppresses demographic data when reporting on crime, if and only if that demographic data is not aligned with their narrative. See, for example, Coulter's Law. Sure, it's not technically lying, because they didn't present a false identification. But selectively hiding the identification when it's editorially convenient is no different.
One additional complexity is that this heuristic of distrust is not symmetrical. What you say about FOX is true, but you couldn't turn that around and apply it to the New York Times. The two sides of the political aisle have different epistemologies.
The Right twists the truth because they see themselves as working in service of the Truth, or at least America. A significant contingent on the Left have disavowed both Truth and America. In their value system, truth claims are only a method for subverting some power structure or other, and are judged not primarily by accuracy but by how well they support whatever progressive narrative is in vogue.
This is turning into a fisking. I apologize.
> There are lines you can cross, and all that will happen is a bunch of people who complain about you all the time anyway will complain about you more. And there are other lines you don't cross, or else you'll be the center of a giant scandal and maybe get shut down.
Are you... are you watching the same country I am?
Ivermectin ... probably does some good. Most of us in the northern hemisphere don't carry much of a parasite load ... yes, carrying a parasite load is a thing. For others who live in wetter climates, where eating a tomato fresh off the vine can expose you to parasites, things are different. Probably all of us carry some slight parasite load, and setting those parasites back a bit probably benefits us. Anthelmintics don't necessarily kill off all the parasites like a magic silver bullet, but just set them back a bit ... or maybe quite a bit, depending on dosage.

Does ivermectin have other benefits? Yes. I read years ago - when I was a cowboy, and using a lot of ivermectin - that treated populations in South America saw a reduction in certain types of cancers. Not that I ever intentionally treated myself ... but the form of ivermectin we used was the pour-on form. You have a bottle with an open chamber on top; you squeeze the bottle and the chamber fills with the desired dosage ... and ideally you pour the ivermectin in a stripe down the center of the animal's back, just as you do with flea drops on your cat. But there you are with an open cup of ivermectin, trying to pour it down the back of an eleven-hundred-pound animal that is scared, fighting, and doing a pretty good job of kicking your ass ... things get a little wild, and you wonder who received the better part of that dosage.

But back to Dr Malone ... who developed mRNA technology, who is fully vaxxed, and who works on alternative uses for existing meds ... and suddenly this guy is a pariah. Something is going wrong.
Now on to climate science ... all you have to do is read the Climategate files and notice that anyone who says "whoa, let's think this through" gets labeled a science denier. Just last week we learned that climate change causes volcanic eruptions ... Ummm, where are the adults in the room? And climate change causes floods, and droughts, and warming, and cooling, and everything is weird because of climate change ... and maybe the actual real changes due to climate change are so very slight that no one will ever feel them ... unless you're on the spectrum, and you can see CO2 - a clear gas - in the air.
Here's the real problem with climate change. Global warming is causing polar ice and glaciers to melt. This increases sea level. Increasing warmth causes sea water expansion further increasing sea level, as a matter of fact, this causes acceleration in sea level rise. Great, we have something we can measure. Now go look at actual sea level rise data from NOAA. https://tidesandcurrents.noaa.gov/sltrends/sltrends_station.shtml?id=9414290
It's about 2mm per year, and as you can see, pretty flat and steady. According to the IPCC AR5, CO2 was not in sufficient concentration to affect the temperature until about 1950. But we see in the San Francisco data that global warming was causing sea level rise back in 1854, a full 100 years before CO2 could have done the job. So there's a lot going on that we're being lied to about.
> In the end, I stuck with my belief that ivermectin probably didn’t work, and Alexandros stuck with his belief that it probably did. I stuck with the opinion that it’s possible to extract non-zero useful information from the pronouncements of experts by knowing the rules of the lying-to-people game.
What do you do when you don't know those rules?
After all, you can't learn them by asking people; they might be lying to you
> The people who lack this skill entirely think it’s crazy to listen to experts about anything at all. They correctly point out time after time that they’ve lied or screwed up, then ask “so why do you believe them on ivermectin?” or “so why do you believe them on global warming?” My answer - which I don’t think is an obvious or easy answer, it’s a bold claim that could be wrong, is “I think I have a good sense of the dynamics here, how far people will bend the truth, and what it looks like when they do”. I realize this is playing with fire. But listening to experts is a powerful enough hack for finding the truth that it’s worth going pretty far to try to rescue it.
If I understand you correctly, you're essentially saying "trust the experts if and only if they say things you already independently know (or strongly suspect) to be true".
That's equivalent to saying "don't listen to experts, just listen to yourself".
So at least we both agree that we should not listen to experts, because they lie.
> And the clueless people need to realize that the savvy people aren’t always gullible, just more optimistic about their ability to extract signal from same.
After watching what you fascists did to my society in the name of 'health' last year, there is zero signal to extract and you're all just either hallucinating one, or brazenly appealing to authority to attempt to pull rank over me
Always looking for new people to add to my list of interpreters. Anyone brave enough to post some they've found? Joe Wisenthal in finance (odd lots podcast) is my favorite example of this.
>But the experts in the field won't lie directly.
To an extent. Here's some examples of tactics common among historians:
1) Quote someone approvingly, or build upon their work, etc. etc., while leaving out or brushing over their lies (or directly quoting the lies approvingly, but leaving just enough room so you can say they said it and not you). See for instance Vine Deloria and Red Earth, White Lies.
2) Make an unsupported statement that at the same time can't actually be disproved. Very common in fields like art history, where you can make all sorts of claims about the author's intentions. My favourite example is a claim that there is a link between trains and atrocities like the Holocaust - not based on any argument that railways make committing atrocities easier, but rather on something psychological about passengers being essentially trapped on a train until it stops.
3) Just plain lying and hoping no one will notice. See Arming America.
A few hours late to the posting frenzy and not sure if anyone will read this, but this made me think a lot about my bounded distrust of science (I'm a biologist) and other peoples' trust in science. I think Scott's 5-HTTLPR post on the old blog summarized the dynamic in science beautifully. If something becomes a possible right answer, there will be an endless stream of small scale studies "proving" it, which is why there are a few hundred studies showing a wrong link between 5-HTTLPR genotype and depression. Likewise:
- It is zero surprise to me that there are a bunch of small scale studies showing that Ivermectin treated COVID and that these did not reproduce in the large TOGETHER study. The Alexander vs. Marinos vs. Katz argument about what small studies to include and exclude seemed almost meaningless. Infinite small studies can be wrong.
- I expect there will be hundreds of studies showing that Long Covid causes virtually all human medical conditions. There are some already and will be more in the coming years.
- There will be lots of studies showing neurological impairment or whatever from lockdown or being a child during the COVID period. These are also starting to emerge
- There will be a bunch of studies showing longterm vaccine side effects, although fewer than Long COVID positives because this is a less respectable right answer.
And of course most of these findings will be wrong.
And yet I don't "mistrust science". When Pfizer does a study on their vaccine I assume it's correct, and I got vaxxed and boosted as soon as I could. There's a subtle difference here. If Pfizer were doing 50 small-scale studies on different vaccines, I suppose I'd group them with the myriad ivermectin studies. But holy crap, how does anyone unfamiliar with the subtleties of our information ecosystem believe anything biology produces anymore? You are constantly bombarded with our wrong studies and yet you still haven't burned our labs to the ground. Perhaps there have been enough unambiguous successes (polio vaccine, your children not dying all the time) that the general population is willing to forgive us for being wrong a lot, but I feel like COVID has exposed how much wrong crap we publish in a way that never really happened at this scale before.
This article puts a label on why I was so furious about the "who cares, politicians always lie, take him literally but not seriously" business during the Trump presidency - because it imagines that all lies are equal. If the New York Times writes a slightly misleading headline then you can't trust them at all, they're as good as the people who think the moon landings are fake.
If the Soviets say that the harvest is "glorious" when they really mean "good", that's still correlated with reality - it's not as good as they claimed, but you probably aren't going to starve this winter. If they announce that *every* harvest is glorious, and each fall has greater and greater surplus no matter how many people are starving in the streets, you get *zero* information. The lie is not connected to reality at all.
We can't force every publication to provide only perfectly accurate information, for many reasons. But we can put bounds on what sorts of lies are acceptable, and make sure that they don't stray too far from the truth. And that makes it important to (1) call out blatant lies, to prevent the boundary of acceptability from shifting, and (2) draw a distinction between calling something out as misleading vs false, to avoid communicating the idea that they're all equally lacking in information.
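The "glorious harvest" point above is really a claim about information content, and it can be made precise with a toy model. The distributions below are invented purely for illustration: a bounded, decodable exaggeration still carries full information about the harvest, while an unconditional lie carries none.

```python
# Toy sketch: mutual information between the true harvest and the official report.
# Distributions are invented to illustrate bounded vs. unbounded lying.
import math

def mutual_information(joint):
    """I(X;Y) in bits, for a joint distribution given as {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Bounded exaggeration: "glorious" reliably means good, "good" means meh.
# The code is shifted but fully decodable once you learn it.
bounded = {("good", "glorious"): 0.5, ("meh", "good"): 0.5}

# Unbounded lying: every harvest is reported "glorious" regardless of reality.
unbounded = {("good", "glorious"): 0.5, ("meh", "glorious"): 0.5}

print(mutual_information(bounded))    # 1.0 bit: the report fully determines the harvest
print(mutual_information(unbounded))  # 0.0 bits: the report tells you nothing
```

This is why calibrated distrust beats uniform distrust: a source whose lies stay within known bounds is still a usable channel, while one whose output never varies with reality is pure noise.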
Boring linguistic point: Abd-ul-Allah, contracted to Abd-ul-lah, means Serves-the-One-God. Abd-ul just means Serves-the. When you see the name Abdul, it's usually paired with a second element, like Abdul Rahman: Serves-the-Merciful. (The Merciful is also Allah. He seems to have a lot of names.) Abdullah Abdullah and Abdullah Abdulrahman would be fine names for a Saudi terrorist. Abdullah Abdulhussein would not, since Hussein (handsome one) isn't a name of Allah, and Salafi Muslims sniff at names that imply servitude to anyone but Allah.
I describe mainstream/liberal media (NYT not Fox) thusly:
"They won't lie to you, but they won't always tell you the truth"
Historians understand problematic issues related to "objectivity", and journalists are even more subject <sic> to these because of the temporal difference. A prior, clear statement of principles and beliefs has always been useful for evaluating historical or journalistic interpretations of data or evidence. If someone tells me where they stand then it really helps me with verifying (or falsifying!) their ideas.
Wow, did you realize when you published this that Alexandros had already quote-tweeted what looks like a smoking gun on IVM? Author of a meta-review seems to admit he knows IVM works, but that he plans to ease into that result over the next 6 weeks, allowing hundreds of thousands of needless deaths in the interim. https://twitter.com/alexandrosm/status/1486136274385702912
On the autistic spectrum, this is the story of my life.
Everyone routinely states falsehoods, and I don't have the brain wiring to easily and intuitively figure out their intentions, including whether there's any intention to deceive.
I stated one in my second paragraph - somewhere, there's almost certainly some human who never states falsehoods, even if it's only a pre-verbal infant.
A high functioning autistic learns, often painfully, the specific rules that govern the statements made in their cultural niche, and what variations are common or possible. They then watch these deduced rules get violated in increasing numbers, conclude there's been yet another cultural shift, and work on deducing the new set of rules. (Or they move to a new sub-culture, and find the rules there unexpected, having been told (falsely) by non-autistics that their local rules are self-evident aspects of human nature.)
FWIW, at the moment I don't have a good sense of what falsehoods are acceptable in advertisements made in the US - beyond far too many. I don't have a good sense of what falsehoods are OK in support of political positions - is there *anything* Trump wouldn't say, if it benefitted him? And while I don't expect public figures among his political opponents to make *checkable* false claims about elections, or the numbers at a public event, that's about all I'm sure they wouldn't do.
Maybe the Soviet Union used coded language, as you suggest, preserving meaning once one learned the secret decoding key. That's certainly common in both resumes and job advertisements, with a few outliers producing entirely fake experience, and (more commonly) non-existent bait-and-switch job ads. There might be a similar way to decode advertisements and political speech, but AFAICT it's easier to make no checkable claims, beyond "this product is infinitely wonderful [in unspecified ways]" and "our political opponents are infinitely evil [attached to a list of lesser sins, often unverifiable]".
With the rules changing constantly, I'm not entirely sure it matters. Major mistrust makes sense, along with a heavy dose of epistemological uncertainty.
This and the EEG study arrived in my inbox 3 hours and 2 minutes apart, and this amuses me.
I like this post, and I agree with it in theory, but in practice I kept saying "but I'm not sure you're making your point here." For example, the argument that Fox wouldn't report on a mass shooting event if it didn't happen, while other news outlets wouldn't report that there was no fraud if there was. These don't seem equivalent to me. One produces tangible bodies. The other produces anomalies in paper trails that, due to our preference for anonymity over security, are hard to validate. And I say that as someone who doesn't think there was (above normal background levels of) fraud in the election.
And this makes the proximity to the EEG post amusing. Because here is a batch of articles with easily referenced evidence, with holes ready to be poked in it, that yes, a handful of experts I don't follow on a platform I don't use shot down - but it isn't like the usual suspects are going to substantially correct their news articles, are they? What is the difference between yet another poverty/child development story and yet another there-was-no-fraud story? At what point can you be confident you are actually threading that needle instead of just suffering from Gell-Mann Amnesia?
This reminds me of a post by Jacob years ago, something about how unless you are REALLY REALLY smart, taking the stupid route in a game is more effective than the slightly smart route, and it's hard to know which category you are in.
The best analogy is the legal system. Everyone understands that the lawyers are advocates for opposing sides and that they will be cherry-picking the facts and spinning the conclusions and inferences to be drawn from those facts. They are both "untrustworthy" as statements of the truth. But, the two arguments (and counterarguments to those arguments) are expected to include the relevant facts and analysis to get the truth. So the Judge has to winnow out the inadmissible and unreliable evidence and decide who is most persuasive.
Basically, you have to read the NYT as a plaintiff's brief in support of the woke left's case, and then go out and find the counterarguments and act like a judge.
Sorry typos, I am on my phone.
People, your readers especially (it seems), are scared to believe that they aren't tracking geopolitics. You aren't.
You pay attention, but what you pay attention to is a moving target reported by interested parties. It isn't what they report so much as what they don't/won't/aren't permitted to report.
You can't spend your life pretending that believing you are ABLE to track what is going on better enables you to actually do it. You can't, and it doesn't (in my opinion).
I am not a fan of corporate media, so you could easily dismiss this as an argument against certain sources of information.
Travel the world, notice the discrepancies.
Ask Koreans how much they actually think about North Korea in everyday life.
Ask an Israeli citizen soldier what they KNOW/witness on the front lines (take it with a grain of salt of course, but ask ten!).
These facts you can hold to be self-evident... we do not have control over what we do not know, and every "fact" we are willing to gobble up that is afforded us by interested parties can be, and often is, a tool used as a means to further an agenda.
It is that complicated and it is that simple.
Might be beating a dead horse here, but since it's relevant to the topic: I used to think the same way Scott does, until COVID happened.
Surgeon general: "seriously people - stop buying masks! they are not effective in preventing general public from catching coronavirus"
Fauci said something similar. (He later commented that he knew the masks worked, but needed to say it to save resources for medical staff.)
Using weasel words to imply one thing but actually mean another is part of the game, outright saying falsehoods kills the game.
This is a great post!
Personally, I like to think of myself as a savvy conservative. I watch both liberal and conservative media, and I try to understand the biases and compensate for them.
However, personally I think there is a real problem with the way we do science. For example, if it's true that immigrants cause more crime (and it generally is), then scientists shouldn't be punished for saying that. Truth should be the ultimate defense against accusations of being a bigot. If immigrants do indeed cause more crime, then THEY should be the ones who suffer the consequences of that fact - not the scientists who simply point it out. Because ultimately, it's the behavior of the immigrants that needs to be corrected, not the behavior of the scientists, and that behavioral adjustment can't happen if scientists are attacked simply for pointing it out.
Part of the reason that I spread conspiracy theories to destabilize the status quo and bring about a new world order is that I believe a society where we are not allowed to talk about objective scientific data without being accused of heinous thoughtcrimes simply because we are "going against the narrative" is an evil society that deserves to die and be replaced by a society that has more respect for the truth. Do I genuinely believe most of the conspiracy theories I spread? No, of course not. But just as our elites are willing to harm innocent scientists for pushing an inconvenient narrative, I am willing to harm our elites in retaliation for them harming those scientists. If they push us, we push back. If they put one of us in the hospital, we put one of theirs in the morgue.
Institutions that are allergic to truth are evil garbage and the people in charge of those institutions or societies need to be cut down and replaced by any means necessary. If they don't like it, then they should start showing more respect for the truth and stop attacking people just for pointing out inconvenient facts that interfere with their desired narrative. We won't ever be able to eliminate tribalism until we're willing to hurt people for demonstrating that trait. And if we want to live in a high trust society, then we need to make truth our most sacred value. Without that, trust collapses and society falls apart.
As a good friend of mine once said "Aim high, but hit low."
It sounds like you're trying to categorize the types of conditions where biased agents are most likely to lie. Instead of doing this bottom-up, as you do with your examples, you should try top-down. You might come up with better buckets than I do below.
My belief is that transparency and access to the same data from multiple observers is where lying is least likely to occur (your example on shooting and the suspect) because it's falsifiable. When there isn't transparency and only a small number of people have access to the direct data (e.g., early 2020 information on natural vs. lab leak origins of COVID), lying should be the starting assumption. If the data isn't easily falsifiable ("sorry, that's a proprietary data set"), and someone has an incentive to lie, there's probably lying going on. Especially true if there is data sitting somewhere that intentionally isn't being made available.
This reminded me of this Military History video. It really made me think about what it would be like to grow up exclusively within a biased environment, and how hard it would be to recognize it - not recognizing the bias, but recognizing the scope.
Soviet Perspective: Invasion of Poland 1939
Also on the dystopian scifi/fiction front, Kameron Hurley's "The Light Brigade". Even when you know someone's lying to you they can still fool you into believing a different lie.
This account of how less sophisticated people think fits my experience as an attorney in contract negotiations between companies and unsophisticated parties (such as the negotiation of an easement for a utility line across somebody's property). The problem is not that unsophisticated people are credulous; it is that their suspicion is unfocused and random. They lack the basic skill of reading a contract and distinguishing between boilerplate terms and material terms. So they will sometimes fixate on completely innocuous terms that are not really up for negotiation and miss the places where they are expected to barter for better terms. (Hint: ask for more money.)
Similarly, my grandfather's dementia manifested as a paranoia about financial matters - which is apparently very common. Maddeningly, this fear actually made him more vulnerable to people selling dodgy financial products, who sold them as providing greater financial security.
I do think that the general public are well aware that educated people tend to hide their lies in equivocal language and the things that they don't quite say. This is why attorneys preparing for a jury trial look for expert witnesses who are willing to speak bluntly and stick to their answers under hostile questioning - they are more believable. The heuristic of only believing experts when they speak simply and bluntly works pretty well under most circumstances, but it is vulnerable to exploitation by con men.
I’m surprised you did not note one very big caveat, considering you yourself have written about it recently.
So, you will not hear any experts say “immigrants definitely do not commit more crime in Sweden”. This is, as you correctly note, a bridge too far.
But what WILL happen is that journalists and politicians will say something like “there is no scientific evidence that immigrants commit more crimes, only a racist would believe such garbage” and there will be deafening silence from the experts - no one will speak up and say “well yes technically that’s true but only because you’ve made it literally illegal to publish any such evidence”.
The IPCC will write a carefully researched, appropriately caveated report. NYT will dutifully report the worst case 3 sigma high end of the model as an inevitable catastrophe, AOC will use this looming doom as a justification for universal daycare, and CNN will collectively pretend that tornadoes never happened in Kansas before global warming. All of them will claim that they are “trusting the experts”. And from the “experts”? Crickets.
So the experts themselves don’t blatantly lie, but if they allow their expertise to be cited in the furtherance of a blatant lie, well, it amounts to the same thing.
I don't think it's worth the effort to dig up examples, but there have been countless outright lies from "experts", just that I am aware of in the past 2 years, at least regarding public messaging.
The heuristic of "they will mislead, but not tell falsehoods about objectively untrue things" is unfortunately not accurate on too many subjects where the foundations of civil discourse and sensemaking have been thrown out the window in favor of what would be effective at achieving a person's goals.
"The savvy and the clueless" - and now tell the one from the other plus/or make the clueless accept that label. Hopeless: 1. Some anti-vaxxers are post-grads, reading lots, and mail you dozens of links for each of their points. (One colleague of mine). Many of those do not even look bad. 2. I quick-check the news on a mainstream website. Another colleague (second. edu. only) sees that and says: "Oh, that is such an obviuosly biased source!" 3. Matt Ridley, biologist and science-author: "At the time, given that I had written extensively on genomics, I was asked often about the chances that the pandemic started with a lab leak and I said this had been ruled out, pointing to the three articles in question. Only later, when I dug deeper, did I notice just how flimsy their arguments were." https://www.mattridley.co.uk/10784?button Titled: I WAS DUPED (btw. Scott Aaronson has a nice review of Ridleys new book https://scottaaronson.blog/?p=6183 "Briefly, I think that this is one of the most important books so far of the twenty-first century."
4. Who is savvy? Who is clueless? Why should anyone pay for mainstream media that has to be read like PRAVDA? - Would the Marx/Lincoln tale have been written if the author thought the WaPo cared for facts? Is the author banned now? Did the NYT fire the author of that Scott hit piece - or the whole board? As Scott wrote: I don't want to accuse the New York Times of lying about me, exactly, but if they were truthful, it was in the same way as that famous movie review which describes the Wizard of Oz as: "Transported to a surreal landscape, a young girl kills the first person she meets and then teams up with three strangers to kill again."
5. Nearly no one is evil (journalists are not, just not up to it), nearly everything is broken. Some will try to be "savvy" on Joe-Rogan-level - some turn to Scotts (Alexander, Aaronson, Sumner) - some to Scotch. SLÀINTE MHATH!
The uncomfortable truth is that - at an elementary level - statistics for journalists are simply numbers selectively used to validate a predefined narrative. Any applied use of statistics is simply not required to get followers. Certainly stats are too abstract to apply to "experts" who provide agreeable information. Journalists you read/hear from today don't have to have had stats training, or designed any study, defined assumptions, collected unbiased data from unbiased populations, written null hypotheses, or ever been peer reviewed. The truth is that multivariate analysis would correctly provide answers to so many (Covid) questions, but it is so foreign and difficult that, under daily journalism deadlines, such reporting is too difficult, boring and complex. Moreover, such analysis simply reduces what otherwise could be a truly "sizzling" headline supported by poor assumptions and little statistical accuracy. And THAT, my friends, is how journalists make money - getting followers - NOT by publishing the truth. If you're reporting on a car crash, sea turtles on the beach, sports or weather, no stats knowledge is required.
This reminds me a lot of https://slatestarcodex.com/2017/10/23/kolmogorov-complicity-and-the-parable-of-lightning/
In the mainstream media, I feel that *actual lies* are rare enough that one should mostly not expect them even from less-trustworthy outlets. I think Fox News tends to be very misleading, but I also think that's a result of topic and perspective selection, mostly not actual lies outside of some occasional non-host lies which go less challenged than they perhaps should be.
The WaPo story is common enough on the left end of the spectrum, which is where my bias lies... Not a blatant attempt to mislead, exactly, but involving some pretty big assumptions or leaps that make me raise an eyebrow. This is the sort of thing I've come to expect from political media I agree *and* disagree with, just as sort of a cost of doing business of engaging with political commentary.
In science, however, I'm getting *much more* open to making the assumption that a given researcher is a big damn liar. There have been way too many studies in the last several years, esp wrt COVID, that simply cannot be the result of good, honest scientific effort. This has to be true whether you're on the mainstream or skeptical side of the various COVID arguments; SOME of the research has to be bullshit.
I've actually been having this conversation with my younger sister. She's gotten increasingly conspiracy-minded in the last few years and I keep trying to tell her that *yes*, I am aware this or that group is not entirely honest but that is not the same thing as fabricating a new reality entirely from whole cloth. There is only so *far* you can stretch the truth before ~nobody believes you.
Closely related: one of my rules of thumb over the past 5-10 years has been that if I see a complex, intelligent-sounding argument against a dominant narrative which itself seems to be completely ignored (no attempts at a rebuttal to it at least), then that contrarian argument is probably largely valid.
Counterpoint: there are good arguments that such contrarian arguments are following a sort of "just asking questions" routine where they just throw stuff out there for the purposes of poking holes in the dominant narrative, and that it's a waste of time to try rebutting them on the grounds that it's easier to start a bunch of fires than to put them out, etc. (e.g. see Sam Harris' attitude towards anti-vaxxers such as Bret Weinstein), and that this explains why certain arguments aren't getting engaged with.
I agree that journalists, experts, politicians, Very Serious People, etc. largely follow the rules of their game when communicating. I don't agree that this makes their communication trustworthy in the sense Scott has in mind, where the savvy can extract a similar (though maybe weaker) signal to what we'd get if we had the unvarnished truth.
The problem is selection effects. To take an example that Scott has mentioned before: how much of a difference in, say, New York Times reporting would we expect to observe if the number of unarmed black people killed by US police in a year was 10 vs 100 vs 1,000 vs 10,000? Surely it would be dwarfed by the difference caused by (let's charitably call them) "consensus effects", where the NYT gauges the importance / tone / narrative of the subject in public discussion (especially elite discussion) and adjusts its coverage accordingly? Yet for any single article it's near-impossible to tell how much it's being driven by consensus vs reality. You'd have to undertake a dedicated long-term research project to extract the "good vs glorious" kind of signal.
Similarly, imagine the concerns of the liberal watching FOX News in the Yankee Stadium hypothetical. They could agree that FOX was reporting facts while still being deeply concerned that its framing and emphasis were calculated to stir up Islamophobia, and that it wouldn't have covered an attack by a white American shooter the same way. And they would have a point.
In the case of news organizations at least, we can confidently extract valid signal from them about certain kinds of ground-level facts, but 1) we rely on them as much or more for *analysis*-- an overall picture of what's going on and why-- and 2) they have almost unlimited influence over *what facts to show us* and have shown willingness to use that influence in the service of their preferred narratives. The fact that they follow a set of rules requiring them to be honest about the ground-level facts they do report is a very weak constraint in this context. How much signal about the true picture of the world could actually be available from it?
This is why I think the NY Times gradually eroding their reliability is a serious issue.
If I have proof of some government wrongdoing—something so outlandish that sounds like a conspiracy theory (Gulf of Tonkin incident faked! FBI tried to blackmail MLK into committing suicide! CIA proposed killing Americans and blaming Cuba!)—I want to take it to an outlet that:
1. Has the resources to investigate and verify my claims.
2. Has the resources to protect their sources.
3. Has the reputation such that if they publish it, *everyone* takes notice and takes it seriously.
For the longest time, I think that was the NY Times more than any other outlet.
If Jacobin publishes this, we'll all roll our eyes and say "okay, sure". If NY Times publishes it, we all agree to take it seriously.
Except now...some people won't, and that's reasonable, for the reasons Scott detailed. NY Times isn't likely to make something like that up, but they're not a bastion of truth. They're merely a bastion of truth-when-it-really-matters. Someone inclined to doubt the story will simply point to all the times the NY Times has misled readers, maybe intentionally or maybe merely negligently.
Whenever NYT or WaPo eats away at their reputation, I'm reminded of what they could be, what we need them to be, and what they arguably were for a long time: widely respected and trusted.
We have no "paper of record" for investigative reporting. I have a good amount of trust in ProPublica, but they lack the necessary name recognition. I don't think anyone's filling that void. It's a really hard void to fill.
A good example of this is Iranian state TV. They're mostly not a trustworthy source on most topics, but then again, they won't just lie about everything with no rules to their game. It is a critical survival skill to be able to draw those lines mentally.
Well, the writer is expert at one thing - covering up for experts who lie. But, as Dan Quayle, a mental patient, says, no one is fooled. By the way, who makes up the majority of mental patients? I ain't going to tell.