642 Comments
Dec 22, 2022·edited Dec 22, 2022

This all reminds me of this book I read years ago called How To Lie With Statistics. It was originally published in the 1950s and still holds up very well. And it really showed that the average American has no idea how to read and interpret data, especially how percentages work. I believe that you can make the average American believe anything with mathematical sleights of hand, as long as it fits their pre-existing worldview. That's why I now roll my eyes when I see "studies show" or "data says", because data doesn't mean anything in a vacuum.

Edit: Also, I got a notification saying that someone "liked" this comment. How is that possible?


This is why I (a mathematician teaching at universities) would like to strongly advocate for some sort of "interpreting data and numerical results" type course to be taught at whatever school I end up getting a permanent job at. Honestly, it would be a much more important part of those students' mathematical educations on the whole than most of the content in course topics I'm passionate about like multivariable calculus and linear and abstract algebra.


I agree. I was assigned the How To Lie With Statistics book by a very logical thinking teacher in high school. In senior year, we could choose to take Human Reasoning (his class) or take Calculus. I chose Human Reasoning because it sounded a lot more interesting and less work. We learned so many interesting things: the prisoner's dilemma and iterations, logical paradoxes, social engineering techniques, Milgram/Zimbardo conformity experiments, the ethics of eugenics, etc. One of my favorite classes ever and shaped how I think now.

If anyone else here happened to have Mr. Brooks at Stuyvesant HS in NYC, you'll know what I'm talking about.


Those experiments perhaps should not be relied on.


Some of those experiments may have been unreliable, but it seems to me that Human Reasoning should be a required part of everyone's education.


'Humans are bad at reasoning on their own, but your average high school teacher will be expected to be so good at teaching human reasoning that they will be able to overcome this shortfall and dramatically improve the reasoning of their average students.' Well at least this position is consistent with the first half of the proposition.


That's too high of a bar. It's enough if the teachers are able to slightly improve reasoning in students sometimes, in areas where people generally think badly.

In particular, that seems more useful than other subjects that are taught today.


Ha, I'm a high-school logic teacher (both formal and informal reasoning), and I'd aspire to be the sort of teacher Sheluyang describes!

However, I'm quite skeptical about the value of logic / human reasoning on its own for helping people identify misinformation. My favorite way of teaching logic is to pack it with Harry Potter references, the Lord of the Rings, and the Lewis Carroll Wonderland books. Students learn amazingly quickly how to reason, despite their examples all being about jabberwocks and hippogriffs and moons made of cheese. They can spot valid arguments and fallacies and cognitive biases almost anywhere, based on their exercises with vorpal swords and bandersnatches. And it all works because validity and unbiased reasoning, themselves, are not guides to soundness or truth.

Our reasoning processes are truth-independent. Logic teaches us how to reason consistently GIVEN our prior starting points. If our priors are false, our rigorous use of logic will only make our conclusions false, or else accidentally true for the wrong reasons.

One of the few ways to break free of priors is by looking at opposing arguments backed up by good evidence. But logic doesn't teach what counts as "good evidence." Evidence is discipline-specific. We have to take physics classes to figure out what counts as good evidence in physics. We take history classes to figure out what counts as good evidence for historiographic theories. We take law classes (by which I mean "the rich kids who can afford them take law classes") to figure out what counts as good evidence in a court of law. Ditto for the med students figuring out what counts as good evidence for vaccine efficacy. By the nature of the case, no general class on reasoning or data interpretation fits the bill. To identify misinformation in our priors, we'd have to become specialists in a hundred different areas.

I'm not sure what lesson to draw from this exactly. As a logic teacher, I think it's dreadfully important to know how to reason validly. But it's also important to know that reasoning is not a silver bullet.

Dec 25, 2022·edited Dec 25, 2022

A very interesting comment, but let me disagree a bit with the paragraph on what constitutes good evidence: we don't need to take physics classes to know what counts as good evidence, we need to do physics *experiments*. And that doesn't take pre-requisite education in algebra and a skilled instructor with a PhD -- kids do experiments in physics all the time, by running and falling and throwing and catching things. They learn an enormous amount of basic mechanics by the time they hit double digits -- no formal classes or study required.

This is one reason all of us actually do have a somewhat reasonable instinctual feel for what constitutes good evidence in at least some physics problems, e.g. if I proposed that I could build a bridge across the Atlantic Ocean for some modest sum, or that a certain very strong man could throw an apple across the Mississippi River, few would be sufficiently credulous to buy that, no matter how many charts 'n' figures accompanied my claim.

With history it's definitely more difficult, because there's no such thing as experimental history. But a lot of history has to do with what people do and choose, and I would say just living among people for a few decades gives you a reasonable starting point for critically evaluating historical claims. (It can of course be taken too far, and often is, when we judge the past by the shibboleths of the present, but this is sort of a mistake of overshooting rather than complete ignorance.)

In medical education it's definitely the case that clinical experience trumps most every form of hypothesizing, however well informed. This is why clinical experience is so heavily emphasized, and why medicine is one of the few broad social activities in which we strongly emphasize evidence and measurement.

And arguably mastering even one of these fields gives one at least a starting point for evaluating evidence in general: you insist on duplicated measurement, you prefer "blinded" studies, you insist on perusing as close to original sources as possible, you are highly aware of the possibility of alternate explanations, you look for unexamined assumptions and uncontrolled variables -- all the paraphernalia of any experimental empirical discipline. So I don't think the problem is insoluble at all, although without doubt there are difficulties, and the associated problem of when and how to trust expertise is not removed.


So are the countless physics *experiments* finding evidence of ESP "good evidence"?


I really enjoyed this comment and agree. I do think, however, that while it is very difficult to teach people how to find truth, you might be able to teach them defenses against some common tricks. It's worthwhile to avoid being misled, even if that doesn't directly help you find the right path.


Critical thinking is not a skill valued by politicians or employers.


I think most employers value critical thinking, at least if they want valuable employees. I'm sorry if your experiences have led you to believe differently, though.


Only for a tiny minority of supervisory employees, and usually only at the very top. Much of the premium for college education is because it is a signal for conscientiousness and conformism.


Have you got concrete examples of this? It reads like a media narrative rather than a real-world likelihood. The question I ask is simply: why would you pay good money for an employee who doesn't add value?


Depends on your definition of value. As Gloria Steinem showed way back in the 1970s, car companies preferred to remain sexist (refusing to advertise in magazines read by women, though more than 50% of spending on automobiles was controlled by women) to making more money.

Dec 23, 2022·edited Dec 23, 2022

Regarding the signaling model of education, read Bryan Caplan on Econlib.

https://www.econlib.org/archives/2011/11/the_magic_of_ed.html

Most jobs do not require serious critical thinking, even white-collar ones (like the tax preparers at H&R Block, America’s largest if seasonal white-collar employer). Conscientiousness on the other hand is very valuable. I’d argue critical thinking may even be seen as a negative if it’s associated with challenging authority.


Agreed. Can't give examples because NDA, of course. Critical thinking is "valued" as long as it is within the orthodoxy, but if someone above you feels threatened you are out. Okay, I'll give the example. The workplace I was at stated their mission that year was "Equity" in the DEI sense (in student outcomes in US public education). And, there was a lot of talk about "moving the needle". As a Data Scientist, I refined their very aspirational definition of Equity, sticking to its spirit, and constructed a metric to measure, not Equity, which is a state, but rather Inequity, which is distance from that ideal state. Similar analyses challenged their claim that their system had not done harm to students of color. Management, all Authentic and Vulnerable and all that, did not like any of this, and made my worklife impossible until I left.
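
To make the construction concrete, here is a minimal sketch of what such an inequity metric could look like (hypothetical subgroups and rates, not the actual workplace analysis):

```python
# Hypothetical sketch: "inequity" as distance from the ideal state in which
# every subgroup of students has the same outcome rate.
def inequity(rates: dict[str, float]) -> float:
    """Mean gap between each subgroup's outcome rate and the best
    observed rate, treated as the ideal. 0.0 means perfect equity."""
    ideal = max(rates.values())
    return sum(ideal - r for r in rates.values()) / len(rates)

# Made-up graduation rates by subgroup:
print(inequity({"A": 0.92, "B": 0.81, "C": 0.74}))  # ~0.097
```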


My reading of this was heavily biased by the NDA being mentioned. If your employer is so bad that an NDA is required to cover the reason for you leaving, I'd regard them as dysfunctional; obviously an NDA regarding your output for them may be reasonable, but if it covers why you left as well, then that speaks volumes about your management. So yes, you clearly had management whose tolerance for critical thinking was limited.

And I'm now hoping that no-one is going to tell me that such NDAs are normal practice in the US, as it won't alter my prior on bad management but might alter my priors on the future of the global economy...


It's weird you imagine yourself a "critical thinker" while unironically using terms like "students of color".


Depends on what is meant by "critical thinking skills." If one means "ability to rapidly comprehend one's social surroundings, so you can get along well with co-workers, work cooperatively, get shit done with a team with a minimum of fuss and drama" -- then you bet, this is valued in every worker, from the bottom to the top. If one means "ability to critically evaluate one's own job, and figure out how to get it done faster and better without requiring some giant revamp of protocols or eight other people to revise how *they* get their jobs done" -- then again, certainly, valued from top to bottom.

But if one means "evaluate from some elegant theoretical perspective how an entire division could (theoretically) be more efficient, and propose expensive and complex reorganization to achieve it" -- yeah, this would not be super valuable in a line worker. It's the kind of thing that a consultant might be hired to address, if there was some clear problem and the old hands had already been asked what might be done.

And if one means "critique the overall purpose of the firm, measure it against social goals one was taught in a freshman seminar to cherish, and propose vast sweeping re-organizations of the firm in order to accomplish these" -- then....mmm...yeah, that wouldn't be very valuable in a line worker, or even a middle manager. Arguably it might be interesting in a CEO, if the firm is otherwise floundering and the board thinks drastic measures are appropriate.


Every employer I've worked for highly valued critical thinking.

And I think most politicians do as well, they're just mostly bad at it outside of their particular subject matter expertise.


How do you propose it be measured/quantified in any way?


You’d need to come up with some way to score critical thinking (it doesn’t fit within the Big 5 model, and while correlated with analytical skills is not the same thing), then perform a regression to assess the impact on earnings or likelihood to get a job offer.
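
As a purely illustrative sketch of that regression (entirely synthetic data; a real study would need a validated critical-thinking score and proper controls):

```python
# Synthetic illustration: regress earnings on a hypothetical critical-thinking
# score while controlling for a correlated analytical-skill proxy.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
analytical = rng.normal(size=n)
ct_score = 0.5 * analytical + rng.normal(size=n)  # correlated, not identical
earnings = 50 + 5 * analytical + 2 * ct_score + rng.normal(scale=10, size=n)

X = np.column_stack([np.ones(n), analytical, ct_score])
beta, *_ = np.linalg.lstsq(X, earnings, rcond=None)
print(beta)  # ~[50, 5, 2]; the last coefficient is critical thinking's marginal impact
```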


I'm dubious about critical thinking being unitary. It's my strong suspicion that an "all topics" appropriate measure of "critical thinking" not only does not exist, but cannot exist for humans. (There might well be possible designs of AIs that could have such a measure.) I feel that thinking is so biased not only by priors, but also by desired conclusions, that critical thinking can only be done in domains where the effect of the desired conclusions is weak...and that this varies a lot from person to person.

Unfortunately, I can't come up with any SMALL example of why I believe this. The examples all have a huge context.


I think that you could get most of the way there by splitting it into two distinct skills:

1) ability to think critically about blergs (i.e. abstract toy models with no emotional valence) - I expect this to correlate strongly with fluid intelligence - and

2) ability to think in an unbiased way about something which one has strong feelings about - this one I don't expect to correlate particularly with intelligence, though personality traits might show interesting correlations as to whether the 'too emotional to think' topics are the common CW ones or some niche personal bugbear.


I teach a class on critical thinking. I have units. Clear writing, use of logic, use of stats, use of science, recognizing cognitive traps, paradox... there are recurrent themes, but I also think critical thinking is a toolbox, not one skill.

Dec 23, 2022·edited Dec 23, 2022

Fazal Majid wrote:

> Critical thinking is not a skill valued by politicians or employers

So true. Many politicians think it a dangerous fault in voters, and a critical thinking employee is often considered a liability by their boss! Loose cannons and all that. :-)


There is no skill of critical thinking. There's critical attitude towards specific ideas. A person who blindly buys into one idea is critical towards others, and vice versa.


It is valued as long as the conclusions are aligned with current corporate policy.


High school math tracks for most kids should cap off with statistics, not calculus.


Even getting an understanding of Bayes’ theorem, which doesn’t require anything but grade-school arithmetic, would do wonders. Our brains are the product of evolution and terrible at estimating probabilities.
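
For instance, the classic base-rate calculation needs nothing beyond multiplication and division (illustrative numbers: 1% prevalence, 90% sensitivity, 5% false-positive rate):

```latex
P(\text{sick} \mid +) \;=\; \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.05 \times 0.99}
\;=\; \frac{0.009}{0.0585} \;\approx\; 0.15
```

Most people intuit an answer near 90%; the arithmetic says a positive test leaves only about a 15% chance of being sick.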


Do they not do a certain amount of that stuff in science lessons when teaching about the scientific method?


I've never seen a curriculum or textbook at the K-12 level that teaches the scientific method in anything that even approaches accuracy, let alone usefulness.

The best I've seen are the few off the beaten path curricula that encourage kids to play outside and build stuff, without any particular accomplishment required by any particular date. This actually encourages children to experiment, hypothesize, test their hypotheses, and gain respect for the daunting gap between theory and reality.

But even when kids do experience that curriculum, it rarely lasts past 3rd or 4th grade, before the inevitable round of standardized testing and teaching to standards takes over, and there's no more time for play.

Mind you, I'm not saying a certain amount of standardized testing and teaching to a standard is bad. We do need educated citizens who have committed a certain big chunk of what is known to memory, so they can work efficiently and cooperatively, and there just isn't time in the standard 12-year education period to have each individual rediscover in his own way any substantial chunk of knowledge about the real world humanity has wrung out over the past 4000 years.

But it's a shame we *entirely* abandon the natural and valuable instinct of children to explore, play, and discover on their own how stuff works. It probably makes for more frustrated and narcissistic rising generations, who are more detached from reality and therefore a bit more fragile.


Agreed!


Do you have favorite books, lectures, yt vids, etc., in the vein of that class?


I don't know who that was asked of, but I'll volunteer as if it were me. :) Yes! How to Lie with Statistics; The Demon-Haunted World: Science as a Candle in the Dark; Lewis Carroll's logic puzzles; The Righteous Mind; Orwell's "Politics and the English Language."

Dec 23, 2022·edited Dec 23, 2022

Are you sure you're not seeing the problem as a nail because you have a particular hammer (mathematics education) at hand? I'm myself kind of doubty that the main problem with people credulously accepting "studies" or pigheadedly rejecting them is....a lack of the requisite math skills. If they were truly *interested* in determining truth by objective mathy approaches, I daresay they'd figure out how to do it reasonably efficiently.


I taught research paper writing for many years and invited students with knowledge of statistics to comment on the statistical methods and conclusions in the papers they read. They did no better, and often worse, than students with no such knowledge. Probably an issue of being less interested in the context.

Over time and in discussion with math and logic professors I’ve come to the conclusion that students who are overly focused on math or absolutely uninterested in math are unlikely to develop critical thinking skills useful for assessing the veracity of what they read. There has to be interplay between formal logic and their own observations of their world.


Having tried to teach college stats, I'm not convinced this would help, and it very likely could hurt. The more rules you stuff into people's heads without understanding, the more chance for manipulation.

Look, the problem with most required math courses is a huge part of the class doesn't want to be there and doesn't like doing the work and no one can think creatively in those conditions. This leaves instructors two options: give students rote techniques they can memorize or deal with a bunch of upset students who put in tons of time and did badly who then complain to admin and parents.

The more people remember testing for p < .05 or whatever, the more that seems like a trustworthy statistic. Better if they've never taken a stat class and just have a generalized distrust than to have taken one but not understood the material.

It's a nice idea for an optional class but that won't reach those who need it most.


Don't get me wrong. I could imagine a different system where this worked but you can't just erase 8 years of being taught math is that awful memorizing rote rules thing.

Personally, I'd rather see us totally eliminate the whole memorized routine stuff. We have CASs now so if you don't understand what you are doing it's better to trust the computer.

Require math only to the point where you can make change and calculate percents, then make it conceptual and optional (so you don't have to keep giving good grades to studious students who hate it too much to think creatively), and make CS courses required. All the students who stick with math do pseudo-proofs from that point on (they don't have to be real proofs, but the point is they face problems designed to require creative solutions, not selections from a list of 20 standard problems).


If you don't understand the algorithm, doing it on a computer does not change that.

I agree that guessing (for badly designed multiple choice tests) doesn't show that you understand it either.


No it doesn't. But if you aren't going to understand it anyway why waste your time learning to do a bad imitation of a computer?

Understanding is worth something. Mere rote application of an algorithm is something that it's no longer worthwhile to train humans to do.


I mean have you ever asked adults in non-STEM careers how much math they retain? For most of them (even in high powered careers like lawyer) the answer is nothing after percentages and reading a graph except a deep-seated fear.

We made them miserable for years of their lives and at the other end they have pretty much nothing to show for it. Indeed, they may be worse off than someone who at least didn't learn to hate the subject.

I think we need to be honest about our success rate and admit that the goals of making people understand mathematics and making it required are incompatible. We aren't succeeding at making everyone understand so lets try and not make the people who won't miserable and help those who want to be there actually understand.

(to be clear the barrier isn't intelligence or mathematical ability...it's simply dislike of the class and the feeling that they are bad at it. A good teacher can help with that on the margin but you fundamentally can't overcome the fact that to convey understanding you need students to want to figure it out -- and after our HS system many of them not only don't but feel it's unacceptable not to be taught a rote algorithm.

Sadly, a lot more of these students would understand the material if that was the only way to do well. But as long as you have to give good grades for rote work, they'll take that option. That leaves making the math truly optional or failing a bunch of people out of college who would make good lawyers and writers etc...)


I think you're not understanding the point you're making.

E.g., I taught myself tensor calculus because I was trying to follow Einstein's construction of Relativity. I did it because I wanted to. But I had no use for it afterwards, and these days I don't remember any of it. I think I could still reconstruct the theory of limits, but I'd need to work at it.

I had no use for the stuff after I graduated, so over the years it slipped away. I also can't remember the proofs from symbolic logic...and I use that stuff all the time as a programmer. But I don't use the proofs. I work with integers, if tests, hash tables, files, that kind of thing. And what I remember is how those interact. I'd even have to look up how to properly invert a matrix.

So whether you want to learn something or not *is* important, but it doesn't determine whether you retain it. You retain it if you KEEP wanting to retain it, e.g., if you use it. If you stop thinking about it, it slips away.

Your argument is a special case of this more general argument. (Paragraph of context and exceptions deleted.)

Dec 23, 2022·edited Dec 23, 2022

We generally don't teach math to everybody in the hopes that a middle manager at Chrysler is going to factor quadratic equations every day on the job. We teach it so that (1) people know math exists, and what can and can't be done by it -- that it's neither mappable onto common sense nor some kind of black magic that can do anything -- so they have a basis for evaluation when technical experts, hucksters and politicians say "This was done by math! Believe!" -- and (2) so that people practice a certain kind of rigorous and empirical thinking, e.g. "I got a root of negative a billion, maybe I'll just pop that back into the original equation to see if it works," since we have found that rigorous and empirical thinking is valuable, but doesn't come naturally to human beings.

That's why it doesn't actually matter that much what *kind* of math we teach, only that it require clear and logical thinking and mental self-discipline, and that it relate at least somewhat to the kind of math that *is* used in technology and science today.


I think the primary reason so many people fail miserably at math is the same as the reason why they fail miserably at learning a foreign language, or learning to play an instrument: It's one of those things that you just have to commit yourself to doing every day if you want to become proficient at it, and most people--even as kids-- just won't commit to that.

As an example, both of my kids have (or will, my daughter's only in 10th grade) 'taken Spanish' every year since kindergarten, and I wouldn't trust either of them to order off a menu in Guatemala. And I don't think it's because the teachers sucked, or the curriculum sucked, or anything like that-- I just think it's down to only doing it 2-3 times a week, which is simply inadequate.

Tying this in with math education, most people realize by the time they're 12 that they can slither through by only engaging with the course material 2-3 times a week, because most math classes are moving so slowly by then that daily study isn't required. Which is great as a kid, but it does lead us to the point where your average 18 year old is going to stare at you like a dachshund who just shit on the rug if you ask them to divide two fractions.


I've seen this argument all my professional life, and in my experience of people, it's deeply flawed. Those who memorize nothing and look up everything are very rarely creative or show genuine understanding, they are completely at the mercy of their sources. It's like Internet commenters who know zilch about a subject except what they can glean from rapid googling and reading through a Wikipedia article. They have a surface familiarity -- can use the jargon, and repeat the conventional wisdom, maybe even paraphrase it creatively -- but they lack any deeper understanding.

Conversely, people who *do* exhibit a deep understanding not only very often have vast amounts of the useful data memorized, at their mental fingertips, they often have emphasized memorizing key points and data all through their learning[1], because it allows them to reason about problems in the field for quite a long way before they need to look anything up. That allows them to fully develop their own reasoning structure, better integrate it with their instincts and typical reasoning paths.

I wouldn't know what aspect of the human brain is responsible for this, but it's what I see. Maybe the problem is that when you have memorized almost nothing, what you have in your head is essentially a bunch of verbal/social scripts, since you lack the "data" anchor points to which you might otherwise have attached chains of intuitive (and nonverbal) understanding.

Another way to look at it: we all know experience teaches better than any amount of lecture. If you want to know how to rip an 8-foot 2x4 on a table saw without taking your fingers off, actually doing it once is worth about infinity hours of lectures, or even watching videos, on it. You just take in far more from your senses when you *do* it than when you watch it. Similarly, when you actually do math problems you learn way more than when you watch a teacher demonstrate them on the whiteboard. But this all suggests there is some necessity in human thinking to have some kind of experiential anchor points for solid learning to take place. And that points to memorization, to the incorporation of factual data as the anchor points.

Of course, one can go too far, and require memorization of stuff that can easily be looked up and probably should be. In college I once took a chemistry class that required me to memorize the *entire* Periodic Table (as it was then), and this was clearly too far. What value can be found in knowing without looking that tellurium is next to antimony?

But I don't get the feeling that this is what is most usually meant (although cases like that are often served up as justification). Unfortunately, what I think is too often meant is that people are complaining that learning is itself hard, requires a lot of practice and memorization, and shouldn't there be some kind of nice shortcut? Can't you just learn to wave your hands and parrot the established conventional wisdom and that be sufficient? Increasingly, the social answer has become "sure!" and the consequences in terms of declining productivity and invention, not to mention adult credulity and innumeracy, are already manifest.

----------------

[1] I'm reminded of Feynman's recounting of how he memorized all the common square roots, because he found it useful to do little calculations mentally to quickly check his reasoning as he was noodling along working out some theoretical new path. Having to stop and get out a calculator would've derailed the train of thought, or more likely he would've just skipped the quick check, with an impact on the eventual rightness of the new path.


I think maybe the rote learning helps with "chunking" concepts or otherwise frees up working memory? A thing you looked up has to be mentally stored and transferred to where you need it, something you 'just know' doesn't take the same resources.

On the other hand, rote learning like this tends to be very topic-specific and will happen for whatever a person actually specialises in, and probably doesn't need to happen for the sort of broad learning one does in school.


Or it helps you to build an initial mental framework for a problem quickly, because it’s easier to see how a new data point might fit into the larger structure. (Carl’s note about Feynman is relevant.) Then, once you have your framework, you can rapidly develop hypotheses to test and make sure you’ve understood the problem correctly.

If you’re building the framework from scratch each time, you’ll be slower and you’ll miss important components. So, your hypotheses will be weaker.

Dec 24, 2022·edited Dec 24, 2022

Why doesn't it need to happen in school? Heavens, the whole reason we sacrifice the valuable labor of young, strong, healthy people aged 12-22, and instead let them loll about all day studying books -- at a staggering cost to the rest of us, who have to work harder to make up for letting the kids be grasshoppers instead of ants -- is because we think they'll come out the other end way more qualified to do sophisticated work, and the quality of the work they do in their late 20s and early 30s will more than make up for their missing decade of contribution.

I mean, if school is only meant as some kind of aristocratic finishing school, where people learn good manners, how to elucidate the social shibboleths eloquently, and the modern equivalent of being able to produce an apt quote of Cicero in a genteel argument -- we can't afford that for everyone. If everyone is going to spend his late adolescence and early adulthood being idle, he'd damn well better be getting some very valuable practical skills out of it. If he ends up needing *more* training, on the job, to actually know how to do something useful, then school is a gigantic waste of public resources and should be reserved for the aristocracy, who can pay for it themselves.


Lots of good (bad) examples here:

https://viz.wtf/


I often tell people that I've taken two really important classes in my literally long life - typing in high school (to have a class with my girlfriend) and entry level statistics in college (to satisfy my math requirement).


There is a similar online course from the University of Washington: https://www.callingbullshit.org/syllabus.html

Dec 22, 2022·edited Dec 22, 2022

Subhead to WaPo article from this morning seems pertinent to lack of statistical knowledge and media misrepresentation: "An American child born in 2021 could expect to live to 76.4 years, according to the latest government data" (from https://www.washingtonpost.com/health/2022/12/22/us-life-expectancy-decline-2021-covid-fentanyl/ ).

It equates the average life expectancy today with how long a child born today will actually live, which are two very different concepts. Covid and fentanyl effects on life expectancy are horrible, but they are also unlikely to be causes of death for babies born in 2021.


Journalists seem as helpless as the average Joe to interpret numbers. My local newspaper rarely manages. (And as a language teacher without massive dyscalculia, I noticed: many a colleague and textbook author shares this.) - The average journalist (without a strong background in STEM or economics) is unable to lie with statistics, because they are unable to understand them in the first place.

Comment deleted

True. The "journalists" today have never had a job until they signed on with some publication somewhere out of college. They have no math or science, and what little history they might have studied has been so corrupted they might as well not even bothered.


Be that as it may, they're doing great at their actual jobs - nothing described above is a failure of journalism, it's a failure to understand what journalism is and is meant to do. Ladling out this slop without even understanding it well enough to signal dishonesty is the point, as is the point of the 'press secretary' we all have in our own minds. Journalism is one of those irregular verbs: I'm a fearless truth seeker, you are honest but misled, he is both a liar and an ignoramus.


Thomas Sowell has a good angle on this, saying that the intellectual class have a greater ability to believe what they would like to believe by having the facility to invent complicated theories and choosing not to test them. "A little learning is a dangerous thing" as the saying goes


True, they are not lying per se, but they are lying about their ability to provide useful information to their readers. And are so insufferable about it at the same time.


Although I am very confident that journalists are insufferable I don't think they know enough about statistics and advanced logic to assess their own ability in those areas. This is the key problem that we need to get out there to accelerate the end of the Late Pre-Truth Era (hat tip to Scott's tweet)


Actually, "life expectancy" data are very difficult for anyone not specifically educated in demography to interpret, because the life expectancy of someone born today is not an expected value in the statistical sense. It is basically the average age at death of people today, which may have little or nothing to do with deaths in 2100 (which is roughly when a child born today is likely to die).

Dec 23, 2022·edited Dec 23, 2022

Dyscalculia is an interesting idea. What percentage of the population is so helplessly challenged by math and formal explicit reasoning that they will never be able to grok stats at a sufficient level to not be easily duped? An experiment on replacing calc with intro stats sounds good but I could easily imagine it'd backfire by giving people overconfidence.


I assumed, by analogy, that dyscalculia was a visual processing issue with *reading* maths, rather than an inability to do it (much like how dyslexics are fine with spoken words)? One can develop tools to work around visual processing issues if it is that, though I can trivially see how they'd make a conventional school curriculum near impossible to follow

Dec 24, 2022·edited Dec 24, 2022

Maybe that was what was meant, but I meant a strong inability to process formal mathematical computations rather than anything sensorily driven. I believe this affects the majority of the population, although it's unclear how much of it is due to a lack of effort and poor training and how much is truly due to innate cognitive faculties. How much of an issue would this be for the rationalist movement if a substantial portion of people couldn't calculate stuff? I guess not too much for things like "donate to these orgs and not to these ones" and "recognize qualitative cognitive biases in yourself" but for things like "forecast events in your personal life via Bayes law and probability theory [i.e., recognize quantitative cognitive biases]," I'm not hopeful.

More generally, the entire field of behavioral economics is built around detecting these biases with questions like "which sequence of coin flips is more probable: HHHHH or HTTHT?" Each sequence has an equal probability of occurring, but only people who think in highly technical terms even approach the problem in this way. The 95th-percentile adult (even a college-educated one) does not think this way, and instead answers a different question than the mathematically formal one that was intended: "which sequence is qualitatively more similar to a typical sequence, in the information-theoretic sense?" That question is totally valid for everyday reasoning, and so they answer the second sequence. But behavioral economists interpret this as evidence of "bias" rather than the correct interpretation, which is that adults simply don't think in this way. I'd guess 98%+ of adults do not think either formally or quantitatively.

One interesting question is how to characterize non-formal thought patterns. We frequently think about this in terms of biases, i.e., how far from "rational" a person is, but less commonly in terms of "anti-biases": what is a person trying to positively capture or represent in their thought processes? This is inherently hard, and psychology has a wildly hard time addressing it, because it's hard to effectively control cognitive strategies as well as observe them directly.
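
To spell out the arithmetic the question is probing (a fair coin, five independent flips, so every specific sequence is equally likely):

```latex
P(\text{HHHHH}) \;=\; P(\text{HTTHT}) \;=\; \left(\tfrac{1}{2}\right)^{5} \;=\; \tfrac{1}{32} \;\approx\; 0.031
```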


The point of a journalist is to probe - that's why the cliche is wearing out shoe leather. Just sitting there retyping the press release does not add value. If you can't understand it, you ask questions of people until you do.

Of course, not many news organizations will pay for this - most of the stories in newspapers don't require it (must fill those column inches). So when there's a story that requires this, there's no one left to do it.


"Journalist seem as helpless as the average Joe ..." -- I swear, journalists are worse. I've seen front-page articles in the Boston Globe get numbers off by a factor of 1,000, numbers about things that the typical Boston Globe reader would notice are far away from plausible. But neither the reporter nor the editor caught it.


"Subhead to WaPo article from this morning seems pertinent to lack of statistical knowledge and media misrepresentation: 'An American child born in 2021 could expect to live to 76.4 years, according to the latest government data'"

"Life expectancy" reporting is one of my hobby horses!

Most of the time I don't think the problem is either:

a) A lack of statistical knowledge, or

b) Media misrepresentation

I think the problem is that people *think* they know what 'life expectancy' means and the way the government data works does not line up with their definition.

I only sorted this out for myself when I saw some claims like this, knew (roughly) what the extra covid deaths looked like and then thought to myself, "This can't be right...."

But it *was* right ... given the calculation they were applying and the data they were using. Unfortunately, this isn't what 99%+ of the population thinks is being reported.

Corporate "earnings" are another one of these ... AOL Time Warner reported an annual loss of almost $100 billion in 2002. It didn't mean that AOL's treasury was short $100 billion ...


Ah interesting, so I guess I am in the cohort of people confused by the statistic.

The header from the data ( https://www.who.int/data/gho/indicator-metadata-registry/imr-details/65 ) matches the subheading I was complaining about. I guess I can't fault WaPo as much as WHO. The definition seems misleading? I don't understand why that is called "Life expectancy at birth (years)" and not something like "Average age of death" for a given year.


Idk how WHO defines it so maybe I'm off on the wrong track here.

But... life expectancy (per the CDC) isn't "average age of people who died this year". It's "for each age cohort in year X, calculate the fraction of the population who died that year. Then if a person lived their whole life under those age specific mortality rates, what would their expected age at death be". For example, if a plague killed 10% of people in each age category in year 2030, but 2031 returns to normal, then life expectancy would dip to 10 in 2030 before bouncing straight back to normal in 2031. Aka the "lived their whole life at these age specific mortality rates" doesn't apply very well to pandemics etc that cause a one off shock to all age groups - someone 30 today won't have the COVID death risk of 2020's 70 yr olds in 40 years time.


Yea that's a good explanation, my attempt at renaming it "Average age of death" would be misleading too because it's missing the condition on age-specific death rates. As you point out, the part that is misleading (per my original comment) is the notion that it can predict the future.


Agreed


What you said is exactly right, but "average age of people who died this year" comes closer to a plain English interpretation of "life expectancy" than "number of years a child born today can expect to live."


"I don't understand why that is called 'Life expectancy at birth (years)' and not something like 'Average age of death' for a given year."

What Aristophanes said about the definition.

Another example of this definitional weirdness is "mean time between (or before) failure" for electronics. You can find examples of mechanical hard drives with MTBFs of 2.5 million hours. This works out to around 285 years. But no one expects a given mechanical hard drive to have any chance of still being operational in 285 years.

The way to interpret the statistic is that for the near future (a few years? 10 years?) the chances of the thing failing during any given hour are 1 in 2,500,000.

But that isn't what the phrase "mean time between failures" implies ...
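
A sketch of that reading, assuming the constant-failure-rate (exponential) model that MTBF quotes implicitly rely on:

```python
# Under the exponential model the hazard is a constant 1/MTBF per hour,
# so P(failure within t hours) = 1 - exp(-t / MTBF).
import math

mtbf_hours = 2_500_000
hours_per_year = 24 * 365
for years in (1, 5, 285):
    p_fail = 1 - math.exp(-years * hours_per_year / mtbf_hours)
    print(f"{years} yr: {p_fail:.3f}")
# 1 yr: 0.003 and 5 yr: 0.017 are plausible. 285 yr: 0.632 is meaningless,
# because wear-out (the far end of the bathtub curve) breaks the model long before then.
```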


Great example of the error in extrapolating measured and estimable short term failure rates to orders of magnitude beyond. No measurement in the short term (unless they conduct artificial aging or usage experiments) can possibly tell you anything about when the other end of the bathtub curve is going to hit.


Yet another example is the misleading media use of “record profits”. If inflation is some positive number then even a stagnant business with no real growth will report “record” profits/earnings every year.
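
A tiny illustration with made-up numbers: hold real profits flat, apply 3% inflation, and the nominal figure sets a "record" every single year.

```python
# Zero real growth, 3% inflation: a new nominal "record profit" annually.
profit_real = 100.0  # constant, in year-zero dollars
inflation = 0.03
for year in range(1, 6):
    nominal = profit_real * (1 + inflation) ** year
    print(f"year {year}: record profit of {nominal:.1f}")
```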


That is really annoying with films, because it would be really easy to count ticket sales instead of $$$$, but they don't, because they want the bigger number every year.


Here I think a significant part of the problem is the fact that non-technical writing tries to avoid repeating the same word or phrase, and thus substitutes things that *sound* like paraphrases, but actually mean quite different things!


It's a flaw in humanities education that we try to teach people to express themselves both accurately and stylishly, and perhaps don't stress the importance of favouring the first objective over the second when they come into conflict. Law has gone that way, and as a result routinely produces work that no-one who is not a lawyer can really grasp, due to an insistence on hyper-accurate but often obscure grammar and vocabulary. So there's a danger that targeting the humanities student's mind on accuracy would just produce unreadable prose (cynically, this actually happened with a lot of modernist writers in the mid-twentieth century), which isn't really what the humanities are about. Recognising you are using a technical term is however important and is perhaps not generally taught enough, other than perhaps in archaeology and linguistics (and if we could employ only archaeologists and linguists as journalists, the world would be a better place?).


I don’t think this is true of humanities education generally. I think any specialized academic humanities education gets to the same sort of insistence on proper use of technical terms (or jargon, if you want to put it negatively) - this is where we get many of the worst excesses like “Latinx” and “phallogocentrism”.

But writing for a general audience aims for style, and intro composition classes try to teach that, which advanced humanities education needs to teach out of you.


I disagree (as a humanities PhD and sometimes lecturer). The terms you describe are derived from the social sciences, which generally do in my experience seek to maintain terminological accuracy over style (one reason their proponents are so insistent on having the right to define their terms in public debate). The humanities, whilst being capable of producing really bad and jargon-laden writing, especially under the influence of social or hard scientific theories, actually aim for a default style such that the interested, intelligent layman could follow the argument (if pressed, I'd say a holder of an undergraduate degree, subject unimportant). Whilst the actual content is often idiosyncratic, it's legitimate to criticise a humanities paper in review for the style being difficult to read (I have done this) and even in reviews to complain that an author has made the subject boring (not actually done enough in my opinion).

There are of course exceptions to this. But get to the root of these exceptions and you'll generally find an academic or department subscribing to a social-scientific school of thought and seeking to apply it in a fairly doctrinal way. The classic example would be the mostly unreadable dross about modes of production that many Marxist academics produced in the early 1980s, in a technical conversation amongst themselves based on accepting as given a model that was starting to collapse both in reality and in the social sciences, where Weber, Foucault and their ilk were becoming the terms of reference.


On writing styles, you may be interested in this:

https://www.overcomingbias.com/2009/03/deceptive-writing-styles.html


Thanks. I'd argue that the classical style is the intent in humanities education with the exception that you show your workings, generally through foot/endnotes (the latter being an abomination unto the gods of good writing in my view...). The ideal work, if read aloud, is a persuasive polemic but one that can on review be checked.


Why would fentanyl be an unlikely cause of death later in their lives?


It's a reflection on the uncertainty of the future. It's unlikely for opioid deaths and the primary opioid being consumed to remain static for the next 70 years. Fentanyl's devastating effect has been within the last ~10 years as an example.


I think people actually have developed a healthy intellectual immune response to this. It looks a lot like saying, "I don't care about your studies, I know what I know!"

Now, this has the downside that it works against both good and bad evidence. But the prevalence of charlatans makes it important that most people have a heavy status quo bias against changing their minds.

You aren't going to convince the average American about something truly absurd. You might get them to smile and nod after being badgered for long enough (so you go away), but they won't flip their positions after a few math tricks. They also won't flip their positions after lengthy correct analysis. People don't change their positions, generally speaking.


Scott wrote something related to this idea a few years back: https://slatestarcodex.com/2019/06/03/repost-epistemic-learned-helplessness/


This. I think people who think critically dismiss news that disagrees with their common sense (per Einstein, the set of beliefs acquired by age 18). But those who do NOT think critically fall into one of two camps: they agree with the news story if it fits with their beliefs (and dismiss it if it doesn't) or they conclude, "Huh, I guess it must be true, since they can't lie".


There's a lot of people who believe obviously absurd things because they've been told them, actually.

See also: People who think that race is just a social construct, people who think gender is non-biologically determined, people who think that the 2020 election was stolen, people who think that vaccines are killing vast numbers of people, people who think that everyone who isn't a member of the 1% is poor, etc.

I think it is more likely that people will believe absurd things simply because:

1) They don't care about them much.

2) Their "tribe" tells them it is true.

3) They want it to be true.


Sorry, I was unclear. People absolutely believe absurd things.

What I meant is that 'you' (literally the person reading this) are not going to convince many folks something that those people would find absurd before you started talking to them.

I think your points on when folks believe absurd things largely check out, even if I might quibble that in many cases that fall into (2) the people involved don't actually believe the absurd thing, but they know what they're supposed to say to make their tribe happy.


I think the idea that race is a social construct is actually pretty defensible. The notion of different identifiable populations with different frequencies of various genes is very real, and our popular notions of race do bear a relation to that, but very badly, and not close to anything you'd come up with looking at actual population structures.

Like, we consider native Irish people to be "white" but native Gujaratis and Japanese to be part of the same "asian" race, which is obviously crazy. As is considering an Igbo, a Masai, a Khoisan, and a San all the same "black" race, despite the differences there being generally larger than between "white" and "asian". We say that the descendant of a "white" and a "black" person is "black", but we say that the descendant of a "white" and an "asian" person is "mixed." Also we've taken to saying that an Argentinian of pure European ancestry, a Mexican of mixed European and native ancestry, and a Dominican of mixed European and African ancestry are all of the "hispanic" race and not actually "white", for explicable but also absurd reasons. And this is even before getting into how racial categories change between time and place, like the first Portuguese explorers categorizing Indians as "black" and the Japanese as "white."

So the racial categories we popularly use are indeed absurd social constructs that bear about as much relation to actual population genetic structures as Aristotelian physics does to real physics.


Yes, this looks like people using the same word, "race", in two quite different ways. That is always a recipe for perpetual conflict.


Similar: rightist criticisms of Black culture get called racist by leftists, though these are (at least nominally) about culture not ancestry. "Black" refers to both an ancestry and a culture, which don't always align (recent immigrants from Africa having the former but not the latter, and Macklemore and Iggy Azalea having the latter but not the former).


> Edit: Also, I got a notification saying that someone "liked" this comment. How is that possible?

The "like" button is hidden on the website here via special-case code, but is still present on the Substack mobile apps.


I’m on the Substack iOS app and I can’t like any comments.


Ah, then I stand partially corrected. I'm on the Substack Android app and I absolutely _can_ like any comments.


I have heard that there's a browser addon that enables liking comments here.


It also shows up in the email notifications about comments.


I'm curious about your name. I know one Chinese girl with the unusual official name of "Langyingzi Lu", 吕郎英子, whose name is that way because (a) her father's surname is 吕, (b) her mother's surname is 郎, (c) her parents wanted to give her a hyphenated surname, (d) the Chinese government refused to recognize a new double-character surname [I asked her once "what if your name were 司马?", and the response was "that's different"], and (e) some problem specific to America must have arisen that stopped her official surname of Lü from being represented either that way or in the common Chinese vernacular of "Lv". (Chinese people call her Yingzi, which is what her name is supposed to be. Prefixing the Lang is purely an artifact of official forms.)

Is something similar going on with you?


I basically have a hyphenated surname. My last name is my father’s, the first syllable of my first name is my mother’s.

You’re the first person since I started writing on Substack to notice that my name has 4 syllables instead of the standard 2 or 3. Kudos.


Do people call you Sheluyang? Substack will let you have any name you want; why put it that way?


Because it’s my legal name. When my parents brought me to America, they wrote it that way. It’s on all my identity documents. This is who I am.


Thanks for explaining. I had a student whose name was four syllables but I couldn't ask why so I always wondered what I was missing.


I would remove the word “American” in all these contexts. It’s unclear that there is any national population that is substantially better at interpreting any of these claims than Americans. (It’s quite possible that there just haven’t been relevant tests done to identify whether Finns or Koreans or San Marinoans do better, but my expectation is that even if they do, it’s not by a lot.)


Presumably the author is familiar with the situation in America and less so with other countries. I read it as a clarification of where the statements are known to apply.

If someone tried to carry these claims over to very non-western parts of the world, the surrounding social structure very likely is different in important ways. As the author has specified they mean an American context, the reader can now make their own judgement call how similar/dissimilar that is to other locales.


Data says people liked your comment. /ducks


You mention Americans a lot. Is there some other nationality which has a significantly better grasp of statistics?


I think that the OP was being careful not to extrapolate without justification. An important consideration when working with data.


See also _How to Lie with Maps_.


Is there an article on that?


Not that I know of. It's a book.


There's more to that story than meets the eye. The author's name was Darrell Huff, and he did a very good job convincing people to be more skeptical of statistical claims. Ten years later, he testified before Congress on that very matter, brought up the story about storks and birth rates and many others, and used them to argue that there was no real causal link between smoking and cancer. Apparently, he was paid by the tobacco industry, and was also working on a follow-up called "How to Lie with Smoking Statistics".

I like Tim Harford's "How to Make the World Add Up", where he tells this story in the introduction and warns that, just as there is a continuum between providing no context and providing all context, there is also a continuum between believing everything you read and believing nothing. It's not clear to me that we, as a society, are closer to the first end than to the second.

Expand full comment

Well then, it sounds like he managed to lie about statistics right in front of the whole Congress. A brilliant meta-example of his own expertise.

Expand full comment

People are generally not good with relative proportions. The (non-paradoxical but unintuitive) "Potato paradox" demonstrates this: a cucumber made up of 99% water is twice as heavy as the same cucumber dried out to 98% water. Very useful when trying to misrepresent how much of X there is.
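A minimal worked example, assuming a cucumber with exactly 1 g of non-water solids (the solids stay fixed while water is lost):

at 99% water: solids are 1% of the total, so total mass = 1 g / 0.01 = 100 g

at 98% water: solids are 2% of the total, so total mass = 1 g / 0.02 = 50 g

Losing a single percentage point of water content halves the cucumber.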

Expand full comment
Dec 23, 2022·edited Dec 23, 2022

> I got a notification saying that someone "liked" this comment. How is that possible?

I've seen that once or twice, and wondered the same, as no thumbs up or down icons or like counts appear against any comments (mine or others'). I assume these are available and visible only to paid subscribers.

Expand full comment

You can like comments in the Substack app.

Expand full comment

Tim Harford wrote a whole book because he loved How To Lie With Statistics so much as a kid. Darrell Huff does not come off well, but Harford's book is highly recommended.

https://timharford.com/2022/01/how-to-truth-with-statistics/

Expand full comment

When our kids were about ten and seven we had them read that book. There was a presidential election after which someone on the losing side claimed that there were nine states where the election results were worse for his candidate than the exit polls, clear evidence of cheating. We told our daughter about it and her immediate response was "how did they choose those nine states?"

Expand full comment

The book Factfulness takes a nice, short, catchy, and accessible shot at fighting this:

‘never trust a number on its own’

Expand full comment

Pravda rarely lied. It just didn’t tell the whole truth. Factory production was up in Minsk while Ukraine starved, but only one of those truths was printed.

Expand full comment

A simple web search reveals that in 2011 about 40% of pregnant women received the flu shot...

Expand full comment

Exactly

Expand full comment

Did more stillbirths happen in 2021 than in previous years?

Expand full comment

Yeah, at this point any claim about widespread vaccine side effects or long COVID stands or falls on this question: if <vaccine or COVID> frequently causes it, it just has to be more common now.

Expand full comment
author

Thanks for this context.

It looks like total number of US miscarriages didn't increase much in 2021, which means COVID vaccines can't cause much of an increase since so many people got the vaccine. I think this means the most likely story is that people reported their post-COVID-vaccine miscarriages to VAERS in a way that they didn't report their post-flu-vaccine miscarriages because side effects from the COVID vaccine seemed more report-worthy since it was new. I've edited this into the post. Interested in hearing from anyone else who knows more about this.

Expand full comment

Yup, although "new" is perhaps less relevant than "COVID vaccine side effects were a big part of cultural discussion" (arguably because everything about Covid was polarised and a big part of the cultural discussion).

Expand full comment

Pregnant people are also advised to get TDAP during pregnancy, because it passes antibodies to the baby. Don't know what % of bad things that happen after that get reported, either.

Expand full comment

Can confirm. I've been pregnant a lot, and they always want the Tdap and flu shot. I don't keep an immune response to pertussis or something, so they always want me to revaccinate. I've miscarried three times, always had some vax or another during pregnancy, never reported or even vaguely considered reporting it to anyone.

Expand full comment

I wrote one blog post with some fertility data. I couldn't find a signal that the covid vaccines caused any decline in birth rates:

https://medium.com/microbial-instincts/are-covid-vaccines-causing-declining-birth-rates-f9a60c7c3f5a

Viki Male has a good FAQ with lots of studies showing that the vaccines are safe for women and don't affect fertility:

https://drive.google.com/file/d/1_wHIYX-tGkGBPwuax7N8BxZPR4PTTCDm/view

That said, I'm not sure either of those sources can rule out that the VAERS data here is a signal of real harm. That's because the signal here is very small. There are about 4 million births in the US per year. The graph you posted said that 3,500 miscarriages were reported in VAERS in 2021. Only about half of VAERS reports are actually US data; the rest are foreign. So, maybe that's less than 2,000 miscarriages in the US.

Even if all those miscarriages were really caused by the vaccine, the rate would still be very low -- on the order of 1 in 2,000. Or likely a bit higher since not all pregnant women got vaccinated, or did so during the first trimester.

The natural rate of miscarriage is about 15%. So we're talking something like 700,000 miscarriages in a normal year. Adding 2,000 more is just going to look like noise in any population level data.
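To make that arithmetic explicit (a rough sketch using only the approximate figures above):

~2,000 US reports / ~4,000,000 annual births ≈ 1 in 2,000

baseline: ~4,000,000 births implies roughly 4.7 million pregnancies if 15% miscarry, so ~700,000 miscarriages per year

2,000 extra on a base of 700,000 is about a 0.3% bump, well inside normal year-to-year variation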

None of the vaccine trials had enough pregnant women to detect a 1 in 1,000 risk, or even a 1 in 100 risk.

And follow-up studies probably weren't large enough either (I think the largest one I've read followed 2,000 women).
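For a sense of the statistical power involved, there's a standard back-of-the-envelope rule (the "rule of three"): if a study of n people observes zero cases of a side effect, the 95% upper bound on the true rate is roughly 3/n. A follow-up of 2,000 women that sees no signal therefore only rules out risks above about 3/2,000, i.e. roughly 1 in 700, which is not enough to exclude a 1-in-1,000 or 1-in-2,000 effect.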

We've only recently seen studies confirming that there's an elevated risk of developing dysautonomia (POTS) after a covid vaccine. The odds for that appear to be about 1 in 1,000. Patients have been complaining about that for 2 years now -- some develop symptoms resembling long covid after getting vaccinated.

There could be multiple reasons why VAERS reports surged in 2021. Heightened concern about the vaccines and heightened awareness of the reporting systems is definitely a big part of it. But it's also possible that the covid vaccines do cause more problems than other vaccines.

Expand full comment
Dec 23, 2022·edited Dec 23, 2022

Yes, it's annoying when people dismiss crackpot theories of what VAERS shows by saying "there's 100% nothing there! guaranteed! we did Phase III trials and all..." This just helps feed the paranoid because it's far too dismissive in the other direction.

The right response would be: "The VAERS data definitely do NOT show a giant effect -- and we already pretty much knew it wouldn't from the Phase III trials and the early deployment -- so if you're arguing it's some kind of Death Jab you're entirely full of shit. However, it's *also* undeniably true that this vaccine was approved quickly, and it was simply not possible to assess risk factors down in the 1 in 1000 or 1 in 100,000 probability range (depending on how rare the demographic slice is), and so it *may be* that the vaccine does have some unfortunate, even deadly, effects on a very rare basis. We'll find out, by and by. Indeed, this is *why* we collect the VAERS data -- to eventually discover even these tiny risks.

"But do bear in mind that new medicine is *always* a trade-off between demonstrated benefit and risk, both known and unknown. There is no such thing as a medicine which is both (1) effective and (2) risk-free. So we have to balance these things: the vaccine will save X lives, but maybe cost Y. What is the relationship between X and Y, and what relationship should we insist hold? These are worthwhile and sensible questions to debate."

Sometimes it's worth noting that when the polio vaccines were approved they *knew* the vaccine would cause a certain amount of polio, because the manufacturing just wasn't good enough to 100% disable every single viral particle. But the judgment was made: this amount of iatrogenic polio is worth saving this other number of lives from wild polio. And the public knew this, and accepted it. It's unfortunate how much more hysterical we are than our grandparents.

Expand full comment

Well said.

There may even be evidence that the vaccine trials didn't monitor side effects thoroughly enough. If there's a 1 in 1,000 risk of POTS, there should have been 20 extra cases diagnosed in the vaccinated group. I'm not sure if Pfizer noticed or reported that.

There's also some indication that severe reactions went unreported. Brianne Dressen claims that her reaction was unreported and she was excluded from the AstraZeneca trial, perhaps because she was sick enough that she couldn't get a second shot.

There's also no indication that any of these reactions are common, or worse than what would have happened had any of these people gotten infected with covid without the vaccine.

As the pandemic continues, it's worth having an on-going, rational conversation about the risk/reward profile, evaluated by age, by booster number, by virus strain. Instead, we've mostly been left with another culture war debate, with polarized camps thinking that vaccines are either incredibly safe or incredibly dangerous.

Expand full comment

Our grandparents, or at least contemporary judges, were hysterical enough to hold the manufacturer of the vaccine liable for damages when someone got polio from the vaccine, on the grounds that the manufacturer did not itself make sure that everyone who got the vaccine knew the risk: it told the medical people giving the vaccine and left it to them to inform the people they were vaccinating. They were also innumerate enough to compare the chance of getting polio from the vaccination to the annual chance of getting polio if not vaccinated, instead of the lifetime chance, in order to conclude that someone informed of the risk might plausibly have declined vaccination as a result.

Expand full comment

I think that's a valuable nuance, so thanks for that. I agree our grandparents weren't all levelheaded cool cucumbers -- people are still people. But I don't think it vitiates the main point. If you are referring to things like the Cutter lawsuits, let us bear in mind there were genuine significant mistakes made, and genuine significant harmful outcomes. That's not in the same category as people suing -- and winning $billions -- because they think baby powder caused their ovarian cancer, or the autism-MMR hysteria.

When people were seriously freaked about polio, they generally accepted the risk of vaccines. These days, serious infectious disease among the young to middle-age is so rare there is no general appreciation for its innate threat, and therefore we lack competence to measure the tradeoff between the consequences of disease and the costs of its treatment (either personally or socially).

This was borne in on me forcefully by the amazing levels of hysteria and flailing about -- in both directions -- during COVID. We acted like a species that had never experienced infectious disease before, and had no general principles on which to tackle it, so we were just ad-hocking it all the time, lurching from one idea to another. God knows what would've happened to us had some well-timed new biotechnology not come along.

Expand full comment

Do you know how VAERS works, Scott?

Expand full comment

Yeah, I was going to add that a large fraction would get the TDAP shot as well, and often reasonably late in pregnancy. The issue with VAERS far more likely has to do with the very incomplete, opt-in reporting than with just the denominator.

Expand full comment

TDAP is administered in the third trimester, usually after 27 weeks, when a baby could plausibly survive outside the womb and miscarriages are in general uncommon. I got TDAP and the flu vaccine and a COVID shot during my pregnancy.

Outcomes for my baby and myself, had I gotten any of the TDAP diseases, flu, or COVID in 2021, seemed far worse than any vaccine risk.

Scan the mommy-webs for a while and you’ll find women who believe all sorts of things caused their miscarriage. Lemongrass. Cold medicine. A bumpy ride on a bus. Given that my *pharmacist* admitted she suspects the COVID vaccine caused her breast cancer, I assume there is no shortage of women attributing to the vaccine miscarriages that would typically have gone unreported.

Expand full comment

Miscarriages are very hard cases, because (I'm assuming) the majority of pregnancies nowadays are planned, so the baby is wanted and expected, and when a miscarriage happens it's very painful for the mother. If there isn't a clear reason as to what happened or why, then it's natural to look around for "I did this and then it happened". 'I'm healthy, the pregnancy was progressing fine, my doctor never said anything about any problems, then I took lemongrass/cold medicine/a bumpy ride on a bus and had a miscarriage' at least gives *some* explanation, rather than "it was random chance and we don't really know what happened, try again!"

Re: the bumpy bus ride, old wives' tales about inducing labour included things like taking a lot of exercise, and miscarriages were said to be caused by things like eating certain foods or having a fall. See this site:

https://www.todaysparent.com/pregnancy/giving-birth/old-wives-tales-for-inducing-labour/

"A bumpy car ride

A bouncing motion is believed to break your water, push the baby farther into the birth canal or into the correct birthing position."

Expand full comment
Dec 25, 2022·edited Dec 25, 2022

> Miscarriages are very hard cases, because (I'm assuming) the majority of pregnancies nowadays are planned, so the baby is wanted and expected, and when a miscarriage happens it's very painful for the mother.

I'm not wholly convinced planning the pregnancy out of overt desire for a baby makes the miscarriage any more painful for the mother.

I watched the following sequence of attitudes in a friend:

1. I don't want children. Why would I?

2. I didn't plan the pregnancy, but I feel that it is a gift from god.

3. (enthusiasm)

4. (miscarriage)

5. I need to find a job that will allow for another pregnancy.

6. I don't want children.

7. No, I never wanted children. Why would I?

The lesson I tend to draw from this is "a lot of people have no idea what they want".

Expand full comment

I agree with your point re timing of TDAP... except, if one is being technical, the graph in the OP says stillbirths (which are definitionally late pregnancy) and miscarriages. (I looked it up: ~20k stillbirths in the US per year, and presumably only a fraction are reported to VAERS.)

Expand full comment

I was a reporter for the WSJ & Bloomberg News for almost two decades. Here's the dirty little secret: you, as a reporter, hate to lie, and almost never (almost never: I've seen, rarely, reporters lie in print) do it. But you outsource lies to experts. Big time media has a massive cottage industry built around itself, made up of think tank hustlers and various analysts that will take your calls and give you comment for free, on anything you want, in whichever direction you want, just in exchange for the exposure and publicity they get from being published in your media. Those guys ("instaquotes" or "instapundits") are sometimes on call, waiting for you to call them at, say, 2200 on a Sunday to give you comment on a regional election in Germany. And they know very well what they have to say, the line they must take, and in fact are very careful to never stray from what you expect from them. So, if you're the Times guy writing about the school vouchers, you KNOW which instaquote to call so you get exactly the lines you are dying for, be they gross lies or just mere obfuscations. These dudes are not real sources, but if you want your bosses happy and your readers content within their thought-bubble, they are almost as important for you as a reporter.

Expand full comment

I cannot recall the name of the course that taught me to recognize certain aspects in news stories, but I can never forget the techniques. Phrases like:

"Experts agree that...." (what experts???)

"Critics claim...." (who are these critics??)

Once you learn to recognize the Sophist rhetoric employed in news and opinion pieces, it sticks with you.

Expand full comment

That's the kind of thing Sagan warned about. I wonder what he'd think of things today if he were still alive.

Expand full comment
Comment deleted
Expand full comment

You know, I've had exactly that thought. Would he have remained consistent with his vision of a purer rationality, or ended up down the path of Bill Nye or Neil deGrasse Tyson, where the political and the scientific have no separation between them?

That was a trap that Sagan himself fell into with regards to the whole "nuclear winter" debacle, which he at least expressed some regret over in later years.

Expand full comment

Everybody gets stumped by the new biological realities that we are finding out more about every day, I suspect being old school Sagan would unfortunately come down on the wrong side of the cishet patriarchy.

Expand full comment

Yes, I'd expect that's how they'd attack him if he held firm to more scientific explanations.

Expand full comment

New biological realities are confusing indeed, like which half of a mammalian species can get pregnant, that stuff is devilishly hard.

Expand full comment

Probably not; Sagan has the same no-nonsense vibe as Richard Dawkins, and we know for a fact that the latter isn't afraid of calling out fashionable bullshit. Sagan is perhaps less confrontational and more spiritual, but don't for one second mistake that for willingness to let bullshit and bullies win.

>Sagan is one of my heroes

This jewel in particular is the one I remember him by :

>>>If a human disagrees with you, let him live. In a hundred billion galaxies, you will not find another.

I don't know what happens after death, but I hope it's something better than nothingness solely so that Sagan and a few others like him can enjoy what they deserve.

Expand full comment

There is a Great Courses Philosophy class on logic where the professor points out that if someone is going to claim that "experts say" they should be able to name at least one of these experts.

So maybe a Philosophy logic class?

Expand full comment

Mark Roulo,

Yes, it was a Philosophy class. Thanks for the memory reminder. 👍

Expand full comment

On the one hand, it's good to make people as specific and as concrete as possible in their arguments. It's a sort of "double think" of the human mind to talk about generalities while meaning only specifics; for example, I may be talking about birds while imagining only cute blue animals in my brain, and not hawks or penguins or chickens. I remember this had a name in AI or cognitive science, but I can't nail it down. It's good to challenge and point this out.

On the other hand, abstraction and lossy compression is how the brain works. Close your eyes and try to navigate your own house by memory alone, or even remember the face of people you see every single day. You will be surprised how often you fail, that doesn't mean you're lying when you say you know these things.

Expand full comment

"On the other hand, abstraction and lossy compression is how the brain works."

I think there are a few things in play here and I'm not quite sure where you are trying to go (maybe because of lossy compression :-)), so:

*) There isn't (usually) a problem with "birds" because people do agree about what birds are. One *CAN* make a mistake by using a general term when your argument only applies to a specific case. Ooops.

But in the example of "life expectancy" we aren't getting generalized birds mixed up with specific hawks, we are conflating two different things because the NAMES are the same (or close enough...) but the concepts are different.

*) The philosophy class prof., in pointing out that "experts say" requires you to know of a specific expert, was not claiming that you need to present that expert as part of your argument (though it certainly isn't a bad thing to do). His point was that if you make the claim and you don't know any actual experts who "say," then your argument is logically flawed, because you can't know that "experts say" if you don't know any experts that "say."

*) Lossy compression is how the brain works, but if getting the correct answer is important then it does become important to define terms. Failing to do so can (easily!) result in circular arguments that never reach a conclusion because the folks involved haven't decided what they are talking about.

A current example from another site I frequent is a "discussion" about whether strategic bombing has ever worked. The two sides haven't agreed on what constitutes "strategic" vs "tactical" bombing, nor agreed on what working would look like. So we get Hiroshima and Nagasaki proposed as examples that worked, and these are rejected because they (a) weren't strategic, or (b) didn't work. This isn't going to get resolved, because the sides don't agree on what the terms mean, so each can be correct using whatever definition they have in mind (but are unwilling to share).

The compression here is lossy, but the lossyness (and/or unwillingness to be precise) means that the discussion can't achieve anything.

Expand full comment

X says Y.

X claims Y.

X admits Y.

X denies not-Y.

X angrily denies not-Y.

And the new one, X asserts Y without evidence.

Expand full comment

Newspapers should use the phrase "some experts say" rather than "experts say."

Expand full comment

Or "an expert told me"

My son was drilled on the logical difference between "for all" and "there exists." I'm amazingly happy that it actually took!

Expand full comment
Dec 23, 2022·edited Dec 23, 2022

Plus the classic principle of correct headline interpretation: If the headline asks a question, then the answer is NO!

That is similar to the technique often used in junk science programs about UFOs or Big Foot and suchlike. They pose a series of questions, as if being oh so open minded. For example, "Could there be alien civilizations living at the bottom of the ocean?", "Could the Loch Ness monster be real?", etc, etc, the answers to which are all again NO!

Expand full comment
Dec 24, 2022·edited Dec 24, 2022

I found _Bias_ by a former CBS(?) reporter to be very interesting on this topic.

Expand full comment

So, I've learned to pay attention to adjectives, especially in headlines.

Expand full comment

"According to anonymous sources within the intelligence community..."

Expand full comment

Of course, that poisons the well regarding expertise itself.

Expand full comment

Regarding go-to experts for the media: I don't want to blacken the man's name by imputing any bias to him, though he's reliably liberal (then again, he is a Jesuit), but for a good while there it seemed like the only Catholic clergyman any reporter had a contact number for was this guy, Fr. Thomas Reese. Any story about the Church in American media, and up he'd pop, quoted for his opinion on it.

https://www.americamagazine.org/voices/thomas-j-reese

https://en.wikipedia.org/wiki/Thomas_J._Reese

"Over a period of five years beginning at the turn of the millennium, Reese adopted various stances at odds with official Catholic teaching on matters such as homosexuality, priestly celibacy, birth control, and the abortion debate.

He resigned from America in 2005. The National Catholic Reporter claimed that Reese's resignation was forced by the Vatican, although America and the Society of Jesus in Rome denied this."

Expand full comment

I have so many questions about this. In your experience, what's really going on in the minds of reporters/editors? Are they aware that they're providing biased analyses? If so, how do they justify it? Is everyone in the newsroom a political ideologue who views themselves as fighting for their side, or are they just trying to maximize their readership by cynically appealing to a particular mindset? What happens when dissenting voices within the newsroom pipe up with "hey, that's totally intellectually dishonest"?

My personal theory is that basically all newsrooms have been captured by one party or the other and are essentially ideological echo-chambers and that all intellectually honest "hey that doesn't sound right" dissenters are shouted down faster than a conservative on reddit. How accurate is that?

Expand full comment

Every news room has a slant of some kind. It could be right or left, or it could be fiscal conservatism or fiscal progressivism. And then the culture stuff could be untethered from that. "The News" and "The Facts" are entirely socially constructed.

Expand full comment

This postmodern take is trivially true and easy to justify, as relativism usually is. But is that really sufficient reason to adopt it, and thus never evaluate one news room as better than another at reporting the news?

Consider the prairie dog. If a prairie dog on sentry duty spots a hawk and doesn't bark, do we say the prairie dog is defective? Or do we say the prairie dog is, according to one particular established opinion, defective? I think we just say the prairie dog is defective.

Now compare the defective prairie dog sentry with a radio station that neglects to report an approaching tsunami. Why should we now say it's just an opinion that this radio station is bad at reporting the news?

Is the difference because humans are metaphysically different from prairie dogs? Is it because of something about the human use of language, i.e. that language only ever refers to itself? Is it because the complexity of civilization makes it impossible to figure out all the ways that events can affect humans? I am not a philosopher, so I would be grateful to see the full rationale spelled out.

Expand full comment
Dec 23, 2022·edited Dec 23, 2022

The standard for the prairie dog is uncontroversially universal, so there's no question about it not fulfilling its role. But there's no universal standard for what the media should be. Or rather, there is, to be on the same side as you politically (the obviously correct one), but few people are honest and self-aware enough to admit this. Tellingly, they tend to be outraged only by misinformation favoring the other side.

Expand full comment

Right? I mean it isn't that complicated.

Expand full comment

This answer broadened my question from whether one particular thing (the tsunami in the area) is or isn't a certain sort of news, to whether there is a universal standard for what the [news] media should be, which would imply that all information can be classified into one of two categories (and excludes e.g. a spectrum model).

Let's get specific with your controversy/consensus distinction. Is there actually more controversy around a radio station not mentioning an incoming tsunami than there is about a prairie dog not barking at a hawk? They sound uncontroversially universal in equal measure to me.

So what is the actual source of the categorical difference? (And when, in the tens of millions of years separating us from our distant cousins the prairie dogs, did it emerge?)

Or else, why aren't we postmodern about prairie dogs' roles as sentries too?

Expand full comment
Dec 24, 2022·edited Dec 24, 2022

Sure, I'll grant that the failure to mention an incoming tsunami by the local radio station is an uncontroversial error. However, this sort of thing doesn't really happen in practice, and it's not the kind of problem that people are bemoaning when they talk about the crisis of trust etc. They are generally upset about the postmodern stuff, which is to say the tribalistic stuff. See the reply by Carl Pham below for elaboration of the role that the media plays, especially on the national level: https://astralcodexten.substack.com/p/the-media-very-rarely-lies/comment/11349942

Expand full comment

This comment wasn't helpful in the least. Every halfway intelligent adult is already aware of these vague truisms. I wanted specific insights from someone on the inside.

Expand full comment

Good thing you wasted even more time replying to it then, big brain stuff. Your theory of course is on the same level or even lower in effort/quality, perhaps why OP didn't respond.

Expand full comment

Let's be charitable. People are willing, in only a few rather narrow cases, to fork out good money for genuinely unbiased information. If you want to know whether to buy this house or that, say, or if you want to know the best restaurants in Paris for a trip. And in any such case, there is a profit-making industry that specializes in providing accurate information for a fee.

But general news is in a different category. It's mostly not actionable, at the personal level. It doesn't really change what we *do*, for the most part, only what we yammer about on social media or on blog comment sections. It bears more than a slight resemblance to scholastics and bishops fiercely debating theological arcana circa AD 1600.

What general news really is, is a certain form of entertainment. It tickles our prurience or cupidity, it gives us gossip to repeat, with glee or rage, it gives us tear-jerker or heartwarming or outrageous stories so we can have a nice cathartic cry, smile, or yell at the clouds.

Since the "news" is really a branch of entertainment, it's not the least bit surprising that it caters to the tastes of its subscribers, and that varying news sources fight over certain demographics. There's a typical New York Times subscriber for whom the Times editors and reporters write, and they know pretty well what he wants to read. There's a typical New York Post subscriber who wants to read different stuff, and the editors and reporters of the Post provide it. There's what the Economists subscribers want to read, and the subscribers to Scientific American, and to Nature, too.

That doesn't mean any of them can't write the truth. It just means that expecting that to be their #1 goal is naive with respect to human nature. They will print gospel truth when it's either sufficiently entertaining or there's no more lucrative option.

Expand full comment

At some point Scientific American shifted from being "science for the intelligent layman" to "purportedly scientific articles that support left of center views." Do you interpret that as due to a change in who read it or what readers wanted, a recognition that its readers didn't actually want what it had been providing, or a rare exception to your description, an ideological takeover?

Expand full comment

That's a very good question. I subscribed as a young adult and loved it, and its demise (in the way you describe) was shocking to me. My hypothesis at the time was twofold: first, that the rise and rapid growth of other pop-sciencey magazines in the 70s and 80s persuaded the owners that a great deal more money could be made by broadening the appeal, making the rigor shallower and the tone less cool and detached; and second, that the available demographics changed, that amateur science and engineering just became less interesting to a mostly male, mostly 19-45 readership.

That demographic stopped tinkering with their cars, building model rockets, taking apart or building radios, grinding telescope lenses, buying home chemistry sets and making invisible ink, et cetera, so a lot of the nuts 'n' bolts "Here's how shit works" material just became less interesting. Fewer readers were interested in being amateur scientists or engineers. Give us big bold mind-blowing ideas instead, sort of "Cosmos" in print, which we can just admire and discuss, without actually *doing* anything. (Climate change is a big bold idea. It may be completely wrong, or greatly exaggerated, but the *idea* that burning oil and coal can totally rework the climate of the Earth is a great big and bold idea for sure.)

I don't know why that happened. I might've said computers, given that today's young men pretty much spend all their spare time and mental energy screwing around with computers, building and operating fantasy worlds. But the change seemed to happen well before that, already in the late 80s and 90s. So I don't know. But...some kind of broad slow sociological change, which I'm tempted to say mostly altered the ambitions and focus of young men.

Expand full comment

Having the electronic equivalent of a rolodex full of think tank phone numbers enables reporters to get a lot of decent work done fast, with a couple of "experts" offering the article's thesis and one "expert" offering the antithesis. Reporters seldom (but occasionally) come up with a synthesis creating an overarching explanation of how both warring sets of experts' views can be true at the same time.

Which point of view gets to be the thesis tends to be fairly arbitrary, with political bias or a new study or favor bank logrolling all perhaps playing a role in determining which set of experts gets to be the protagonist in the article.

Expand full comment

also, it's in bad taste to quote the same instapundit every time (you look lazy) so you must shuffle the cards, so to speak, and keep all of them engaged and ready to take your call

Expand full comment

During the few years I worked as a reporter a couple of decades ago, I found that academic geniuses Noam Chomsky and David Hackett Fischer ("Albion's Seed") would graciously answer my not particularly highbrow questions. E.g., for my utterly disposable article on the general topic of "What to Look for on Election Night 2000," I recall Fischer pointing out for me that Delaware had an impressive record in the 20th Century as a swing state that went with winners, for reasons that, according to Fischer, had deep roots going back to the 17th Century.

If I'd stuck with reporting, I likely could have recruited a wide roster of brilliant professors to provide me with quotes about current events. Fischer, for example, was pleased that I'd carefully read his magnum opus "Albion's Seed" and that I asked him intelligent questions about the latest news relevant to his life's work.

My view was: Why doesn't every politics reporter ask the great David Hackett Fischer for his view on the upcoming election?

Similarly, Tom Edsall's columns in the New York Times in the 2020s largely consist of him emailing contemporary questions to highbrow academics and posting their answers.

Expand full comment

Yup, same thing happened to me while covering Iberian political infighting. I used to call the very famous and gracious historian Stanley Payne, who was always kind and informative. Other reporters thought that Payne, who wasn't always highly critical of Generalissimo Franco, was a Fascist who shouldn't be entertained. In the end, I found that such important people are often smart and complicated and they often don't give you the exact quote that you need, or take ages to do so. Instaquotes, being faster and nimbler, work much better for the Internet age, and they always have the perfect pithy quote ready for you.

Expand full comment

Why wouldn't you (or another reporter) cut corners and lie outright that a source told them something? Does someone fact-check with the instapundit?

Expand full comment

It sometimes happens. Stephen Glass is an oft-cited example, but there are many others. But you have to make up the source completely. In big-time publications, editors will check with the original source of the quote when there are doubts or questions, if they can find it. The instapundit may also complain, of course; they have a business & reputation to protect; they're players, not dunces. https://en.wikipedia.org/wiki/Stephen_Glass

Expand full comment

Matt Taibbi's recent book Hate, Inc. is revelatory about how the media has changed in recent years.

Expand full comment

What about lying by omission? For example, don't report something even though it is clearly newsworthy, because it makes you or your news business or the politician / party you like, look bad?

Expand full comment

That's sort of the trivial case of misinforming via lack of context. Or perhaps misinforming via lack of context is a particular case of lying by omission.

Expand full comment
Dec 22, 2022·edited Dec 22, 2022

Lying by omission is still possible. I think the argument here is more Hanlon's Razor: Never attribute to malice [in this case 'lying', whether by omission or no] that which is adequately explained by stupidity [in this case 'misinterpreting context'].

Expand full comment

It's not very amenable to censorship, at least until things get really bad.

Expand full comment

All true. It's very, very difficult to report honestly on something you have strong partisan feelings about. Even if you try.

I'll add what anyone who's ever been quoted by a newspaper, or who has had the experience of reading a newspaper story about an event one witnessed firsthand, knows: reporters just get things wrong. A lot. They misquote, and paraphrase in misleading or plain incorrect ways. They report some facts, and omit other equally important facts. And omit critical context. They fall for plausible narratives from biased parties. All this, just via incompetence and ignorance, with absolutely no intent to deceive.

This happens all the time, to a far greater extent than those who haven't witnessed it imagine, even without any intent to report anything but the purest truth. This is just human fallibility.

When there are strong partisan feelings or moral outrage, so much the worse. Even when the reporter is genuinely trying to be fair (which I suspect isn't as often today as it used to be - journalists once felt a professional obligation to at least try to tell the truth; that has recently broken down).

There is no fix other than multiple voices with multiple viewpoints, and critical crosschecking of facts.

Expand full comment

I was in a meeting detailed in the NYTimes. It was sort of a radicalizing moment for me. They accurately quoted exactly what an executive said, in a meeting that actually took place, and they still managed to get the entire context exactly backwards.

Expand full comment

It's hard to be an instant expert. It's not surprising that reporters sometimes get the context wrong on a topic they never heard of before today.

Expand full comment

On the other hand, it's a bit too convenient if every time a wider context made you look bad or reflect critically on a cherished belief, you just - wonder of wonders - get it wrong.

If someone has absolutely 0 problem with arithmetic, but every time you lend them money they suddenly struggle with fractions and mistake 1/4 for a quantity greater than 1/3 ("coincidentally" in their favor), you would be understandably skeptical.

Sufficiently advanced stupidity is indistinguishable from malice.

Expand full comment

'Sufficiently...' neat phrase I will remember.

Expand full comment

This is basically my exact experience with journalism, especially in the two times I was actually interviewed for the story.

Expand full comment

The Houston Chronicle ran a front page story about my Rice U. college bowl team in 1979 with a lot of quotes from me as the team captain. The reporter didn't take a shine to me so he made me look bad, but in an utterly fair way: by not cleaning up the quotes I gave him over the phone but instead by quoting me absolutely verbatim, including a grammatical mistake I made talking to him, leaving in all my "you knows" and "uhs," and letting me ramble on like an idiot.

I was initially peeved, but my ultimate opinion quickly became, "Well played, sir, well played."

So when I read people complaining about how reporters quote them, I recall how often reporters pass up the chance to make them look bad by quoting them precisely, and instead the reporters polish up what they were told into respectable written prose.

Expand full comment

That’s totally still “lying” if that is standard practice in the field and then they omit it. There is a context to the discussion.

Expand full comment

I love everything about this post except for the conclusion, because it's not at all clear to me that an unmoderated free-for-all would produce better results than what we're getting right now, and on priors I would expect it to perform worse. Just because there's no biased central authority in your system doesn't mean you don't end up with bias - but it does increase uncertainty about the direction and magnitude of that bias considerably.

"Different biases will cancel each other out and truth will prevail!" Not impossible, but again no reason to expect that on priors. The socially dominant strategies will generally win and have no binding commitment to the truth.

Again, the problem you're describing is real, but there's no argument for why the proposed solution would work. It's like when people say we just need to abolish capitalism. Yes, it would be great if we could solve all of our coordination problems and optimally harness all available resources for the betterment of everyone, but getting rid of the thing you're pointing at does nothing to further that goal, and in fact throws away a lot of very useful social infrastructure that could otherwise be used for exactly the goals you're aiming at.

The problems are real and complicated and will not be at all helped by getting rid of the giant central thing that makes any of it work at all.

Expand full comment

I find the concept that the "truth will out" or the "truth will set us free" to be very unconvincing.

The real argument for free speech, perhaps the only meaningful one, is that censorship by its nature requires an act of force against another, and there are very few circumstances where a matter of speech should justify such an action.

Speech most certainly isn't violence, but censoring it certainly can end up requiring violence to implement, whether under force of arms or color of law.

Expand full comment

The truth will out if there's some way for untrue statements to be shown to be untrue, and if that process is visible and people care about it.

Expand full comment

True (LOL), but in the meantime, there may be suffering caused by the suppression. I'd always rather avoid the suppression to begin with. Not always possible, of course.

Expand full comment