254 Comments
Comment deleted
Expand full comment

That cuts against the observable evidence, however. If women and ethnic minorities entering the sciences allows for increased discoveries we should expect that the rate of discoveries has increased over the past 100 or so years as more women and ethnic minorities have gotten into the sciences. That does not seem to be the case; the correlation would go the opposite way.

Expand full comment

"[W]e should expect that the rate of discoveries has increased over the past 100 or so years as more women and ethnic minorities have gotten into the sciences. That does not seem to be the case;"

What? You don't think the rate of "discoveries" has increased over the last 100 years?

I'd say that is a ridiculous statement, but perhaps I do not understand what you mean by "discoveries" and the "rate of discoveries".

I'd say the last 100 years represent the most significant increase in the rate of discoveries in all of human history.

Expand full comment

You might have been missing the thread of this and the previous posts. We are not generating more geniuses, in the sense that big breakthroughs are not happening as much per scientist, per unit of time, or per whatever metric you want. It takes more scientists, more time, and more money to advance things now than 100, 200, 300 years ago. The discussion is about why scientific progress is slowing down, not why the rate of scientific progress is increasing.

If you have evidence that the rate of scientific progress is increasing I expect Scott and everyone here would love to see it. I would.

Expand full comment

The original comment is deleted, but looking at the rate of discovery per scientist feels like the wrong way to assess whether opening up science to women and minorities led to more progress. The way we'd expect that to help would be by increasing the number of people who are scientists (and possibly increasing the average quality of scientists if we think that demand for scientists is inelastic and greater supply pushes up required quality).

This is hard to measure, but Erik Hurst & friends have a paper arguing that something like 20-40% of economic growth since 1960 can be attributed to opening up professional occupations to women and minorities.

Expand full comment

It is totally possible that things would have been worse had women and minorities not been allowed to be scientists, and the rate of progress would be even lower than it is. One has to make that argument though. The original comment did not, but rather said that it has been a big boon. If it has been a benefit, that benefit has apparently been overwhelmed by negatives.

Economic growth is very different from scientific advancement as well. It is much easier to produce more stuff by putting more people toward producing it, especially because it is easy for people to move from lower- to higher-productivity occupations and to know which is which: higher pay pretty consistently suggests higher productivity. In science it is really hard to do that. You don't see really smart biologists changing careers to become political scientists because that is where all the productivity is. (Arguably lots of failed mathematicians becoming economists has made the field worse, but that's another issue.) So even if it is true that a third of economic growth is attributable to bringing in women and minorities, which I would buy as reasonable, it isn't at all clear that it should likewise apply to science.

Expand full comment

I totally agree with you that improving economic output by increasing the number of people available for high-productivity occupations is much easier than increasing scientific progress in the same way. And, of course, measuring productivity is much easier than measuring scientific progress. I'm also happy to believe that the deleted comment was totally wrong :).

One small thing I'd still push on though is that, in my understanding, scientific progress per se hasn't necessarily slowed down. My understanding is that the _number_ of scientific advancements per year has, in fact, increased rapidly. With the example of Moore's Law, maintaining a constant rate of increase in the number of transistors per chip means that the number of transistors added to chips each year is dramatically increasing. Likewise for crop yields. Likewise for the number of patents and research publications per year.

The issue is twofold: first that the rate of increase has slowed down across many scientific domains, and second that the number of researchers employed to produce that rate of increase has increased dramatically (so that, as you say, discoveries per scientist have fallen).
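A minimal numeric sketch of that distinction, with made-up numbers (none of these are real transistor counts or researcher headcounts), just to show how absolute output can explode while output per researcher falls:

```python
# Illustrative only: constant exponential growth in output, faster growth
# in research effort. All figures are invented for the example.
transistors = 1_000          # "transistors per chip" in year 0
researchers = 100            # research effort in year 0

for year in range(1, 6):
    prev = transistors
    transistors *= 2          # Moore's-law-style doubling: a constant *rate*
    researchers *= 3          # effort assumed to grow even faster
    added = transistors - prev
    print(f"year {year}: added {added} transistors "
          f"({added / researchers:.1f} per researcher)")
# The absolute number added each year keeps growing, while the
# per-researcher figure falls whenever effort grows faster than output.
```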

My point is just that the way we'd expect expanding the talent pool in science to be helpful is that it would allow us to add more researchers, not that it would make those researchers dramatically more productive. If anything, we'd expect opening up research to more people to make the average researcher less productive if we have diminishing marginal returns to research effort. So: expanding science to include women and minorities could very well be a dominant factor in our ability to maintain scientific progress over the past half-century without it having had any positive effect on the difficulty of creating scientific advances.

Expand full comment

"If you have evidence that the rate of scientific progress is increasing"

What is the evidence that the rate of scientific progress is NOT increasing?

I see no evidence that scientific progress is decreasing.

The number of geniuses increases with population. 0.1% of the population are geniuses. That has always been the case.

It takes more money ... compared to what? There is more wealth.

The "big men of science" framing is a kind of hagiography.

Expand full comment

https://www.gwern.net/docs/economics/2019-cowen.pdf

There you go. Lots of discussion about the rate of scientific progress decreasing over a variety of metrics.

Also this discusses the issue: https://www.researchgate.net/publication/347395640_Title_Scientific_Progress_and_Rate_of_Progress

Also also: This subject is talked about a fair bit on this blog.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

I don't find either of these at all convincing.

I will not claim that regression is not possible.

And if the claim were grounded in something like the process of growth that Geoffrey B. West writes about, I might take it more seriously.

And if someone suggested a coefficient like "what we know" divided by "what we know we don't know", I would certainly entertain the notion that we (humanity) have been in a constant state of getting dumber.

But if we divide human history into let's say 35 chunks - maybe ~5000 year periods, I would say there is no evidence that the rate of scientific progress could in any way be said to be decreasing.

Expand full comment

The number of journal articles has increased exponentially.

I believe the number of significant discoveries has declined dramatically since about 1970. Certainly the life of the average American has changed less in the past 50 years than in any other 50-year period since America was "discovered".

Expand full comment

What? Fifty years ago there was no personal computing, internet, sequencing of the human genome.

Expand full comment
Apr 3, 2022·edited Apr 3, 2022

The modern world was roughly in place in the 1920s-1930s; everything after electrification* has been a footnote.

*Except antibiotics.

Expand full comment

And vaccines?

Understanding DNA?

Nuclear weapons?

Computing?

The internet?

Birth control?

Agricultural improvements?

All footnotes?

Expand full comment

No; women and ethnic minorities entering the sciences doesn't magically create more jobs in the sciences.

But also, the increase in the number of scientists over the past century is already much, much greater than a mere doubling.

Expand full comment
Comment deleted
Expand full comment
Apr 1, 2022·edited Apr 1, 2022

Mm, I was under the impression that the idea that most hunter-gatherer societies were egalitarian is pretty discredited, contrary to what was implied in the presentation.

Expand full comment

No idea, but I would assume that hunter gatherer societies would find it harder to develop the large economic surpluses necessary to support a hierarchical society.

Expand full comment

This is not the case. Some, like the Pacific Northwest Native Americans, had substantial surpluses and conspicuous consumption, but anyway...

Expand full comment

How were their societies arranged? Did they have a priest class? Dedicated warriors? Clearly defined leaders?

Expand full comment

I was delighted to find out that the opening weekend of deer hunting season in Michigan is an especially prosperous weekend on the Magnificent Mile in Chicago because it has become traditional for the wives of deer hunters to flock to the stores to do some expensive gathering.

Expand full comment

What delighted you about it?

Expand full comment

Middle class American hunters and gatherers.

Expand full comment

Rather than political effects or the mechanical effects described herein, I think there are also effects relating to ideology: amateur scientists don’t exist as much because people don’t believe amateur thinkers can provide value anymore, and so they don’t even try. This is distinct from the aspect of only believing credentialed figures when told things, though it is related.

There are strong arguments to be made that a number of scientific fields are wrongheaded in some fashion. In the 1980s, doctors were telling people to avoid SIDS by having babies sleep on their tummies, and now they insist quite strongly the *exact opposite is true.* Numerous “paradoxes” around infinities seem to indicate, at least to some, that maybe we are operating on a false assumption or two there. Professional physicists have failed to reconcile GR and QM for decades.

The mechanistic model here doesn’t address the “revolution” problem of science: where some philosophical or other assumption is overturned by some “brilliant” new idea (ones that can often be more common with amateurs than professionals - Einstein being a patent clerk is a good example.)

Expand full comment

It's not enough that current fields are wrong in some way, it has to be a way that an amateur has some real chance at correcting. I don't know anything about SIDS, but I expect it would take a lot of work for an amateur to even properly understand the open questions when it comes to infinities or quantum gravity, having occasionally tried to understand these myself, and that without this work there is no chance of them making a contribution.

What paradoxes do you think infinities have that modern mathematics fails to resolve?

Expand full comment

Whilst I'm not qualified to write of hubris in the medical field, there are a lot of 'boots on the ground' folk who make a lot of good observations, but those folk are not Dr. Fauci level, thus their observations get blown off.

For instance, I work with a guy (non-medical) who ran a COVID testing clinic for a while. Today, he's a true believer in masks—I think he was luke-warm before. But he says "I checked in 91 positive cases in one day, and didn't get sick."

Expand full comment

I think it's going to depend a lot on the field. All fields will have some degree of politics to them, because humans are human, but in some fields approximately *all* the barriers are political (eg. many humanities), whilst in others, the knowledge barriers are extremely steep (eg. maths, theoretical physics), or the financial ones are (many experimental sciences, but especially medicine).

Personally, I studied experimental particle physics - while knowledge is somewhat of a barrier there, the biggest obstacle by far is that getting any useful data requires massive machines that take hundreds or thousands of people to build and maintain and cost potentially billions of dollars (I think the LHC is more expensive than most experiments in the field, but the others aren't cheap and they're complementary - you can't do what the LHC is doing without spending billions.) Theoretical particle physics, by contrast, needs little more than a good computer, but the maths is brain-meltingly difficult and all the obvious ideas have been published decades ago.

Expand full comment

> amateur scientists don’t exist as much because people don’t believe amateur thinkers can provide value anymore, and so they don’t even try

There are people like this, the trouble is that they tend to be cranks. Cranks are characterised by an excessively strong belief in their ability, as amateurs, to make great scientific discoveries.

Is there a sweet spot between academic insider and kooky crank where you can still make important discoveries? Well maybe, but in most fields you're going to have to spend a lot of time catching up with the current state of the art before you can even find a problem worth thinking about.

Expand full comment

I think another cultural piece here is that it is only seen as allowable for cranks to contribute. Sober/serious people just focus on their businesses or what have you, rather than trying to email Sean Carroll.

Expand full comment

Yes, but those very amateur discoverers were probably seen as cranks by their peers. What do you think of some dude who spends his time polishing little glass beads and trying to look through them? That's van Leeuwenhoek trying to make a microscope ... but to his neighbors, he's some crank who polishes little glass beads.

Expand full comment

And the same person can simultaneously be a revolutionary legend in one field and a crank in several others: https://en.wikipedia.org/wiki/Isaac_Newton%27s_occult_studies

Expand full comment

Regarding your SIDS example, I'm no scientist, but I was a defense lawyer long enough to see that medicine is science plus something that isn't science.

Claiming certainty about the human body even in a particular case is usually stating too much.

Certainty about general health advice is so far removed from science it's better seen as akin to trends in fashion.

Expand full comment

That just means medicine isn’t fully accurate yet. The same is true in all sciences, it’s just that physics has come a lot further than medicine.

Expand full comment

It is really difficult for an amateur to understand state of the art in a specialized discipline, let alone improve on it. You need instruction from people who'll correct your misunderstandings and point out helpful literature, at least, and at that point you're in a PhD program.

Perhaps the premodern equivalent of a PhD program was just chilling with your philosopher buddy, but that is no longer enough.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

> amateur scientists don’t exist as much because people don’t believe amateur thinkers can provide value anymore

I think this is true. I also think that this belief is almost certainly correct.

I have an academic position in a world-leading astrophysics department. As such, I receive a LOT of unsolicited theories from amateur thinkers, almost all of whom believe that they have produced new insights which will push physics forward. It's always very high-concept stuff, like using quantum theory to do away with Dark Matter, or combining electrodynamics with relativity to disprove the Big Bang. Etc.

I do tend to read them fairly carefully, and without fail these authors have a very poor understanding of the topic at hand. The best of them make silly mistakes which would be obvious to an undergraduate; the worst of them seem little more than schizophrenic ramblings.

I think we're past a point where amateur researchers can usefully contribute. If you want to do cutting edge science, you need a more experienced scientist to show you the layout of the field, recommend relevant literature, and (importantly) correct your wrong ideas. I was in a top PhD program, and almost all of my cohort had the experience of coming up with what felt like a smart new idea, and being told 'actually Smith et al. tried this back in 1978, and it doesn't work because [x/y/z]'. Without this guidance, even smart people are going to be hopelessly lost. And once you get this guidance, you're no longer an amateur.

Expand full comment

Just to chime in, I think the same is true in biology. A lot of resources needed even to answer relatively simple questions. To answer hard questions, you need an extremely strong foundation based upon mentoring and a lot of study that an amateur would have a tough time achieving.

Expand full comment

I think this is probably more true the harder/more scientific a field gets, but it certainly doesn't hold for everything. A friend of mine is a mechanical engineer, and has published some linguistics work that has legitimately pushed forward the frontiers of knowledge regarding the language in question, even in the estimation of professionals studying the same (and related) languages.

Linguistics may not be astrophysics or biology, but there's still rigor to it, and it's still possible to be obviously wrong in a way that isn't true of, say, philosophy.

Sliding more toward the philosophy end of the scale, one of my favorite pieces of history writing (Six Frigates) was done by an amateur, and it's well regarded by scholars, too.

Both speak to the accuracy of the foraging metaphor in different ways, I think.

Expand full comment

I suspect that what you really want is someone who is outside this discipline (so they don’t share all the same starting assumptions as everyone else) but has been trained in another discipline (so that they have the discipline of thinking rigorously).

Expand full comment

I agree with this, but note that some disciplines still just have insane amounts of existing knowledge to absorb first. I think people with the properties that you describe are far more likely to come up with genuinely good ideas than most amateurs, but will still meet the rebuttal "actually Smith et al. tried this back in 1978, and it doesn't work because [x/y/z]" fairly often. Coming up with good ideas is only one part of the problem, they also have to be novel.

Expand full comment

I think there may be more wiggle room in the humanities for this; you can be an amateur sitting at your desk going in minute detail over old publications and chasing trails along new lines of thought as a secondary interest to your main job. There is still room for a Michael Ventris in these fields:

https://en.wikipedia.org/wiki/Michael_Ventris

Once scientific advance has gone beyond "work it out with a pencil and paper", you really can't do that on an amateur basis; as noted above by several, you need the labs for the practical work and the advisers to steer you away from dead-ends.

Expand full comment

If you’re going to be the amateur genius who deciphers Linear B, it helps to have an Alice Kober do 20 years of heavy lifting on the project before you get started.

Kober was a Columbia-educated professor of Classics who spent nights and weekends for decades doing the kinds of frequency studies you could now do in seconds with a computer. She made 180,000 index cards.

Kober collaborated with other specialists, but didn’t publish about her work on Linear B until she’d been working on it for 15 years. She won a Guggenheim Fellowship to devote herself to the problem full time for a year. And then, perhaps on the brink of cracking Linear B, she died. Michael Ventris and his collaborators inherited all the resources she developed, which they acknowledged.

It’s not like Kober wasn’t recognized, but Ventris got the lion’s share of the fame while it could be argued that Kober, who was no amateur, did the lion’s share of the work.

Expand full comment

Are you willing to mention the specific linguistics work?

Expand full comment

Sure—he's been doing comparative linguistics between documented varieties of the Nivkh language (or Gilyak, in some sources; it's nearly extinct but was historically spoken in the Amur basin and on Sakhalin Island).

"Application of the comparative method to vocoid sequences in Nivkh" should get you going in the right direction, if you want to know more. Most (all?) of his Nivkh-related output is in Kansas Working Papers In Linguistics, but he did get a paper on Quechua into IJAL.

Expand full comment

Einstein might have been a patent clerk, but he already had a university education in physics and was also working on a PhD dissertation while employed there; Einstein is a good example of a young person creating a revolution, but not a strong example against credentialism or anything. (I'm getting this info from https://en.wikipedia.org/wiki/Albert_Einstein#Patent_office). Also, his patent clerk job still actively involved his science background.

Expand full comment

If he was merely working on his PhD, then he didn't yet have the credential.

Expand full comment

Sort of. "Physics grad student" is a weaker credential for competence in physics than "physics PhD", but it's a stronger credential than "random patent clerk", and physics grad students are taken seriously as producers of physics research in a way that patent clerks with no particular formal background in physics are not.

For example, I'm pretty sure that today, reputable physics journals publish a lot more papers written by physics grad students than papers by clever laymen.

Expand full comment

Someone "working on a PhD dissertation" has already qualified for at least a Masters, even if one refuses to factor in the x% of a PhD.

PhD students are actively expected to produce and publish novel research, it's literally a requirement to graduate; it's just that these days "incremental improvements" are proportionally even more of the publications than ever before.

Expand full comment
Apr 2, 2022·edited Apr 2, 2022

I think the more important question is, is there no longer opportunity (and/or motivation) for brilliant 20-somethings to earn a living (in a job that fits their education), work on a PhD dissertation, and still have enough free time to let their minds wander and think about fundamental problems of their science?

The most common complaint I hear is that if you are a PhD student working in "a lab", a great deal of your time goes either to teaching assistant duties or to running experiments your professor instructs/orders you to do.

People call it a master-apprentice model of science education and think it sounds nice and old-timey. Very few people remember that apprenticeship was often grueling work done on very unfavorable contract terms. Benjamin Franklin ran away from his brother's print shop and founded his own in Philadelphia (and became successful). So did Rousseau, apprentice to an engraver in Geneva, who ran away and found himself the protégé, and later lover, of a random French noblewoman.

Expand full comment

It varies by country and by discipline. In my case, all my time was for research towards my thesis, though classmates who weren't still living with their parents spent a not insignificant amount of time teaching undergrads for money. However, doing research directly necessary for my thesis was a more-than-full-time job, so idle chats about the great mysteries of the field were rarer than I'd have liked.

By contrast, my girlfriend did a PhD in literature, and describes a positively relaxed workload.

Expand full comment

In the Days of Discovery, a scientific education—any education—was available to only a very few. Thus the Age of Discovery people whom we consider 'amateurs' were the very people who would have vocational degrees in today's world.

Expand full comment

Perhaps "revolutions" happen when a new territory is opened up, rather than discovering another idea within the existing territory.

Expand full comment

Einstein was no amateur - he was a graduate student in physics before he got his "day job" of being a patent clerk.

Expand full comment

One other place where this theory would predict a difference is in the artistic domains. Since we explicitly value novelty, we don't run out of easy melodies like we do easy scientific discoveries.

Music fits this theory in some ways (the most acclaimed artists are young) but not in others (to succeed you need to dedicate all your focus).

Unfortunately, these areas are subjective so determining decline is impossible. But if decline is real we would expect a steady decline in artistic greatness over time.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

Though modern instruments also represent an unforaged area. Beethoven did not have a turntable, Bach did not have an amp. And I don't have stats on this, but I'd suspect modern classical composers (movie composers?) tend to be middle-aged.

Expand full comment

We do have a lot of “genius” pop and rock stars though, who usually start young. Would they not count?

Expand full comment

I think the degree to which novelty is valued in the arts is overstated. It's valued by (some) critics; not so much by the wider population. At the very least, insofar as they care about novelty, the audience only care about novelty relative to what they've experienced before themselves, not whether something is objectively novel, so the good ideas can be mined again and again with each generation, rather than being depleted.

You can't be considered a genius in the hard sciences by rederiving all of Einstein's equations; John Williams can become one of the most celebrated orchestral composers of his day by (masterfully) emulating the techniques, and in some cases tunes, created by his elders like Gustav Holst and Erich Korngold, in part because most moviegoing audiences who latched on to Williams had never really heard much Holst or Korngold.

(Maybe the argument is a bit facile, but it's easy to construe the much-discussed remake/adaptation obsession of cinema and television in the 21st century as the ultimate consequence of this. All the 'best' ideas, in terms of mass appeal, *have* been found, so now they're just getting recycled over and over instead of barrel-scraping for new ones. A simplification, I think, but I believe it does point at something real.)

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

My problem with this model is that human genius should be at least semi-constant on at least a per capita basis if it's primarily genetic. If it's primarily environmental then you should expect to be able to produce it in a way we haven't been able to. If it's a combination (like I believe) then you're waiting for the right environmental conditions in which genius can express itself.

However, this has a marginal effect. In the worst conditions one or two geniuses will shine through. In moderate conditions a few more. In abundant conditions many. But once you have many geniuses it makes sense to specialize. When you're one of twenty scientists in England then it makes sense to do five things and to make foundational but ultimately pretty rudimentary discoveries about them. When you're one of twenty thousand then it makes sense to specialize in specifically learning about... I don't know, Ancient Roman dog houses. This creates more and higher quality knowledge. But it creates fewer towering geniuses.

Further, keep in mind you don't have to outrun the bear, you just have to outrun your competition. You can get a lot wrong, and so long as you're relatively more correct you'll do well. This also explains how amateurism decreases. A few hundred years ago I'd probably be able to make some serious contributions to a variety of fields. Now I can't. Not because I do not know those fields or have interesting thoughts about them, but because I no longer contend with a few scattered amateurs; I have to contend with several thousand people who have spent their entire lives studying whatever it is as a full-time, professional job.

Expand full comment

This all seems reasonable as far as it goes, but maybe the impression that we are producing fewer geniuses nowadays is more due to relatively contingent and parochial-to-humans facts about the sociology of fame assignment within groups (as hinted at briefly in point #3) than it is about great feats of insight, or what makes for good examples of creativity or impactful problem-solving or whatever.

In the forager analogy, the other foragers considering the problem of who finds good fruit sources are only able to consider foragers that came to their attention in the first place, and that could be due to reasons other than the actual fruit-harvesting (especially if the impact of fruit-harvesting were as hard to quantify in isolation from framing as that of scientific genius).

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

>maybe the impression that we are producing fewer geniuses nowadays is more due to relatively contingent and parochial-to-humans facts about the sociology of fame assignment

Right. This debate seems to actually be about "What makes geniuses celebrities?" One thing that helps make geniuses celebrities is having been born before the 20th century. 20th and 21st century media have made it harder for geniuses to compete with non-geniuses in celebrity space.

Expand full comment

How much of our definition of "genius" depends on celebrity, though? Certainly there have been some famous-with-the-general-public-in-their-time scientists, but would a lot of the people we now label as geniuses get recognized on the street when alive?

Expand full comment

Well, "fame within groups" doesn't necessarily imply fame-while-alive, or fame-with-the-general-public. For whatever that is worth. Scott's Example #3 was contemplating the politics of small research groups.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

My point is that our awareness of -- though not our definition of -- genius depends on fame. How can one person take stock of the number of geniuses across hundreds of fields without using fame as a heuristic?

> would a lot of the people we now label as geniuses get recognized on the street when alive?

I suspect many of their names would have been known by upper and upper-middle-class people. All the examples cited in "Contra Hoel": Newton, Mozart, Darwin, Pasteur, Dickens, and Edison were celebrities within their lifetimes.

Expand full comment

There are a lot of examples of famous geniuses who died in obscurity, but they almost all died young.

In 1975, my high school teacher said, contra the Romantic notion that great artists aren't appreciated within their lifetimes, that practically everybody who is famous today was famous within three score and ten years of his birth. E.g., Van Gogh only sold one painting in his lifetime, but if he could have lived another 33 years to age 70, he would have been rich.

Vermeer _might_ be a counter-example. On the other hand, he seems to have been appreciated enough in his own lifetime to have the luxury of working extremely slowly, turning out only one or two paintings per year. But then he died fairly young, and soon after Louis XIV of France attacked the Dutch Republic, which ended the Golden Age of Dutch painting as the economy shrunk.

Vermeer's repute, whatever it was during his lifetime, then faded, although Paul Johnson says that a small line of connoisseurs passed down word of Vermeer's greatness for almost two centuries until he was generally rediscovered in the Victorian age.

Expand full comment

Here's a way to measure this question using a pre-selected sample: how many Manhattan Project scientists would have been completely unknown walking through, say, Grand Central Station in the decades after 1945?

I'll give you my subjective opinions as somebody born in 1958 who isn't bad at remembering what was common knowledge and what wasn't. But I'm not a scientist, so the following doesn't reflect the opinion of a professional.

To the extent that Einstein was involved in the atom bomb project, yes, he was an immense celebrity, as famous of a face as Marilyn Monroe.

I think Oppenheimer would have attracted attention from a not insignificant fraction of the passers-by. He was a very distinctive looking man with a gaunt Cillian Murphy-like face (I suspect Christopher Nolan is making his "Oppenheimer" biopic to give his friend Murphy a major role.) Of course, much of his fame/notoriety derived from the hoopla over his security clearance being stripped in 1954 due to his many Communist friends.

Von Neumann was less unusual looking, but he was on TV a lot before his early death.

Fermi's picture was widely shown.

Teller was on TV a lot in the 1970s arguing in favor of nuclear power and the like.

Bohr was a giant, but I have no recollection of his face. Same for Bethe, Wigner, Szilard.

People that aren't all that famous anymore like Seaborg might have been on TV a lot.

Feynman is a folk hero today, but I can recall reading James Gleick's magnificent full page obituary for Feynman in the New York Times in 1988 and thinking to myself, "Wow, this guy was awesome, why did I never much notice him before?" Obviously, Feynman was a legend among physicists while alive, but I don't think he made much impact on the public until the space shuttle explosion hearing soon before his death. And, even then, I wasn't aware of his now legendary O-ring dunking in ice water demonstration until his obituary.

Expand full comment

>20th and 21st century media have made it harder for geniuses to compete with non-geniuses in celebrity space.

This sounds plausible. New media (radio/TV/internet) does seem to have more of an effect on the fame of athletes, entertainers, charismatic politicians, and talking heads than on the fame of scientific innovators and sophisticated theorists. This is most likely a shift from the prior era, when fame was built on word-of-mouth and journals/newspapers.

One assumption here is that there is some limit to the number of people we can meaningfully know by reputation, at least without dedicated memorization. I guess this would be similar in effect to Dunbar's number (but presumably not similar in cause).

Scientific innovators do get some new media coverage. But the increased specialization of these innovators' fields, mentioned elsewhere in this thread, makes their achievements less comprehensible, even when explained well on slickly produced video. And so we probably retain less of what we do learn about these innovators. Which would exacerbate the new media effect.

Expand full comment

I think the other thing is that new scientific achievements are often done in teams now. Much harder to remember five names than one. Can we meaningfully remember the scientist who developed mRNA vaccines for COVID? No, because they're scientists, plural.

Expand full comment

I think there is some evidence that supports the idea that ML researchers make breakthroughs at younger ages. The classic example would be Ian Goodfellow, who invented GANs while a grad student. Also the Turing Award winners LeCun, Hinton, and Bengio all did their seminal work while much younger.

Expand full comment

I don’t buy it. This assumes the subset of the space that’s been searched is a significant fraction of the total space (even if you just consider the “easy” subset). If it’s small, you can always just move slightly to the frontier of the set and find new low-hanging fruit. There’s no reason a priori to assume that this region should not be huge.

In my area, theoretical physics, I see plenty of interesting research problems that are no more difficult than problems a generation or two ago. In many cases, the problems are easier because we have much more powerful tools.

I do, however, see the field unable to pay grad students, unable to get grant money relative to other fields, hemorrhaging good students to outside of academia, trapped in a nightmare of academic bureaucracy, and with an increasingly large number of outright crackpots.

Expand full comment

I suppose that one of the problems with modern theoretical physics is the enormous cost of the machines necessary to experimentally confirm the theories.

Expand full comment

Or a lack of sufficient insight and creativity to imagine low-cost experiments that could do the same. What did the microwave antenna cost, which Penzias and Wilson used to provide probably the single best piece of evidence ever for the Big Bang? Had you asked someone in 1960 what an experiment that would provide solid evidence for that theory might cost, there's a pretty decent chance he would have named some exorbitant figure -- because the idea of just pointing a big microwave antenna at an empty patch of sky hadn't occurred to anyone.

Expand full comment

I had dinner with Nobel laureate Robert Wilson in the late 1970s. He was very modest about how he didn't know that his discovery of universal background radiation proved the Big Bang Theory until some Princeton physicists explained it to him and Penzias. Princetonians have complained about that Nobel ever since.

But as another astronomer asked me a few years ago, "Do you think it's a coincidence that they gave the Nobel Prize to the best experimental radio astronomer of his generation for his greatest discovery?"

Expand full comment

In physics I think that a large fraction of the space has already been searched. That is, for pretty much any physical phenomenon you can think of, we have a pretty good explanation. A few centuries ago a layman could sit around and wonder "What the heck is the sun?" or "What's the deal with fire?" or "Where do rocks come from?" but nowadays you can find pretty good explanations to all of these in books aimed at eight-year-olds. We've harvested pretty much the whole space of questions that an interested layman might be able to think of.

The only questions we still can't answer tend to be either (a) extremely obscure questions that only an expert could think of, or (b) possibly in-principle unanswerable like "What happened before the big bang?" and "Why is there something rather than nothing?"

Expand full comment

I'm not sure this is true. From things I know about: sonoluminescence does not have an accepted explanation. The Russian toy that turns one way only (a rattleback) is easy to simulate (and the simulations do agree that it spins in one direction only), but there is no model that describes the motion (without the need to integrate) in a way that makes it obvious that it will spin in one direction only.

Expand full comment

Not really.

What the heck is turbulent flow and how to predict its behavior?

What is ball lightning?

Why is the sun's corona way hotter than the surface?

Why can you wring gauge blocks?

What happens to the airflow inside a blues harmonica when you bend notes?

Why do chain fountains work, exactly?

Can you jump into a supermassive blackhole and survive, as the gravitational gradient wouldn't tear you apart? What would happen from your perspective?

Expand full comment

While sounding simple, these all sound to me like derived, deeper explorations of the much more accessible and understandable questions, like "why do things fall?", "what is water?", "what is sound?". To see black holes we are "standing on the shoulders of giants", and to answer many of these we need tech and knowledge invented by exploring the much simpler questions.

Expand full comment

I'd never heard of wringing gauge blocks.

https://en.wikipedia.org/wiki/Gauge_block

You're welcome.

Expand full comment

Barring perhaps turbulent flow, these questions seem much more esoteric than the kinds of questions that you could reasonably ask 200-300 years ago. Like, they could ask, "What is lightning?"; you ask, "What is a super-rare kind of lightning that almost nobody has ever seen?"

Expand full comment

That's because you are privileged with knowledge a person 200-300 years ago would have lacked. Everyone, as he gains information, finds the questions that puzzle someone at a lower level seem simple, and the questions that puzzle him at his current level seem complex. But what's "simple" and "complex" changes predictably with your current perspective, just like what's "poor" and "rich" changes with your own current income.

Expand full comment

I mean, I think it's not. Isn't my example pretty clearly an example of something that's just vastly more esoteric by any standard? We have explanations for almost everything that we commonly encounter, and the things we lack explanations for are almost entirely things that are ultra hard to observe. This was not true 300 years ago.

Expand full comment

A set of gauge blocks is under 100 dollars, and they're routinely used by at least a 6-digit number of workers in the US. Not exactly rare.

Some of the others are certainly bad examples though. We can set up accurate differential equations to describe turbulent flow physically, but they're just not mathematically well-behaved. They're not SOLVABLE of course, as complex differential equations tend not to be, and they're absurdly sensitive to initial conditions and simulation error.

This isn't a flaw in the equations though; it's them accurately representing the state of nature. Turbulent flow itself is horribly behaved in much the same ways, and virtually impossible to force replicable flow outside of the most carefully controlled lab environment.
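For what it's worth, the equations in question are the incompressible Navier-Stokes equations; writing them down is the easy part, and the hard part is that in the turbulent regime their solutions are chaotic, and whether smooth solutions always exist is literally a Millennium Prize problem.

```latex
% Incompressible Navier-Stokes: momentum balance plus incompressibility.
% u = velocity field, p = pressure, \rho = density,
% \nu = kinematic viscosity, f = body force per unit mass.
\begin{aligned}
\frac{\partial \mathbf{u}}{\partial t}
  + (\mathbf{u}\cdot\nabla)\mathbf{u}
  &= -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u} + \mathbf{f}, \\
\nabla \cdot \mathbf{u} &= 0 .
\end{aligned}
```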

Expand full comment

This. Reading up on gauge blocks, since I'd never heard of them before, is the molecular adhesion that mysterious? It seems like the sort of thing where the details might be hideously hard to compute but the gist of it is simple enough.

Expand full comment

Well, we know the equations for fluid flow, and have done for over a century. The fact that they aren't practically solvable is the issue. This one is one I could see an "amateur" solving, in the sense that I expect advances to come either from an obscure mathematical advance (if analytic) or from an advance in computing (if a numeric approximation) rather than from physics directly.

As for black holes, that's in the "literally untestable even in theory" category - easy enough to say that you'd survive to the event horizon if the grav gradient were sufficiently low and your craft sufficiently shielded (the accretion disk gets extremely hot), but the very definition of an event horizon is that what's inside is unknowable.

The others are much further from my wheelhouse, and I haven't even heard of gauge blocks or chain fountains, which rather argues against them being the kind of phenomena laymen think about deeply.

Expand full comment

Another one that took forever to get a good answer for: how does ice skating work?

https://www.vox.com/science-and-health/2018/2/13/16973886/why-is-ice-slippery

Expand full comment

You should check out Steve Mould's youtube channel. He's got lots of videos about this kind of physics of random everyday stuff. Sometimes he comes up with a convincing explanation, occasionally there's still a lot of mystery left about how the thing works. I bet there's at least a few videos there where a detailed explanation of exactly what's going on would be new to science.

Also: No one knows how high temperature superconductivity works, though I guess you'd say that's not accessible to laymen.

Expand full comment

Given the price of high-temperature superconductors, no, I think you need to be a fairly well-funded lab to do any experiments on those.

I will, however, note that a lot of historical "lay scientists" were aristocrats who could throw rather a lot of money (for the time) at their experiments, so maybe it's not entirely fair to talk about purely financial barriers to a field?

Expand full comment

If the problems are easier because we have much more powerful tools, then you don't get as much genius cred for solving them.

Expand full comment

"This assumes the subset of the space that’s been searched is a significant fraction of the total space ... ."

It assumes that searching the subset of space that has been searched has consumed all the "search energy" of the searchers to date. It says nothing about outer limits.

Expand full comment

What's your starting point for "a generation or two ago"? I might agree that many of today's theory papers could have been published in 1990 if the computer tech were available, but also I have the impression, in my field at least, that major theory advances have been slow since the 1960s.... (which is not to say it's the fault of theorists at all, but rather that many of their theories are extremely hard to test and there are seriously diminishing returns for the 100th untestable theory to explain something). Some of it is also just luck - supersymmetry is a beautiful, elegant theory, and when it was first thought of it seemed quite plausible that it would be true, but when the LHC finally got running, there was no sign of any SUSY.

Expand full comment

Scott, I think this model has less explanatory power than your previous* model, because it fails to account for discoveries which make other discoveries more available. For example, had Newton invented calculus but not the laws of motion, this would have reduced the depletion of the forest in that area, because some things which *could* be discovered without calculus are much easier to discover with calculus. Maybe you could throw in something like builders (teachers in real life) who build roads which make other areas easier to reach?

The point of this is that more innovations in whatever makes things easier to understand, maybe educational psychology (if it's effective at making better learners at scale, which I don't know), would reverse this trend, and the model should have something to reflect that.

*https://slatestarcodex.com/2017/11/09/ars-longa-vita-brevis/

Expand full comment

I agree with this argument. Scott's model assumes scarcity - scarcity of fruit, scarcity of knowledge, scarcity of science. However, a dynamic that may apply to fruit might actually be inverted for science/knowledge. Knowledge begets knowledge. It's a tree that continually branches ad infinitum. 1000 years ago, medicine was a shrub, whereas now it's a massive tree the size of a jungle (sorry for the sloppy metaphors). An invention such as CRISPR or mRNA technology opens up new frontiers for tens? hundreds? thousands? of important inventions.

Taking educational psychology as an example - Piaget may have made quite important discoveries, but the "science" or the "knowledge tree" of educational psychology is still a shrub. Perhaps it will be a jungle in a few hundred years and everything we think and do vis-a-vis development/education will be different. If important discoveries in educational psychology are made, they are not decreasing the discoveries to be made, but organically? increasing them.

Expand full comment

An excellent point. If this branching tree metaphor is better, and I think it is, we might expect some branches to crap out earlier than others (so we get all the breakthroughs and know everything about the field in that branch) but there should be ever more questions to answer, and more discoveries to be made.

If I were to try and tie it back to Scott's metaphor, advancing technology should allow for more efficient searching over time. You build roads (someone mentioned this above) you get horses, you stop going towards places that just don't seem to have more food. This should allow you to access increasingly more food with even a linear progression in how fast you can cover distance because the area of the circle gets bigger.
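A quick worked version of the "area of the circle gets bigger" point, under the simple assumption that the searched radius grows at a constant speed v:

```latex
% If the searched radius grows at constant speed v, so r(t) = v t, then
A(t) = \pi v^{2} t^{2}, \qquad \frac{dA}{dt} = 2\pi v^{2} t,
% i.e. the fresh ground swept per year grows linearly in time
% even though the travel speed v never changes.
```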

Of course, Scott's model doesn't include things like "Keep going over the same barren, dead end ground because someone wants to pay you to cover that ground, because they want that ground to be true." Sort of the normative sociology that economists make fun of: the study of what should be causing society's ills. Most sciences that even tangentially relate to public policy start to fall into this trap.

Expand full comment

To apply Scott's metaphor, scientific discovery is really all about the mushroom caves, not the flat ground. Random events or quirky observations open up whole new approaches to understanding the universe, which then quickly get fully explored by more workmanlike research. This is basically Kuhn's model, and I'm not sure why Scott tends to ignore it--especially since it helps explain why the enormous edifice of the professional scientific community, which is primarily concerned with picking over known ground, produces fewer breakthroughs than its size would predict under Scott's "foraging" model.

Expand full comment

I only read about the first third of Kuhn's big book about Paradigm Shifts, but I came away with the impression that, to my surprise, its celebrity is deserved.

Expand full comment

These metaphor extensions are fine, but the core virtue of Scott's metaphor (scientific progress ~ picking reachable fruit) is that it's very simple and yet generally fits the data on how our scientific knowledge grows. Or are you suggesting that there is some systematic data that it fails to predict well?

Expand full comment

Yes, it fails to predict the actual pattern of scientific advancement, which more closely resembles a series of random bursts of varying size, depth and speed than a slow, steady march.

Expand full comment
Apr 2, 2022·edited Apr 2, 2022

I think there's an underappreciated aspect here which is that lots of advances may be happening but it gets harder and harder for Scott (and other laymen) to appreciate them.

My subfield was effectively inaccessible until the last century, and my impression is that most scientific fields are similar. No one is asking questions that were conceivable to ask 100 years ago, maybe even 50 years ago. In the metaphor, we set up new cities close to the frontier, and get better and better railways over time. By the end of grad school, you've mastered a multitude of ideas that are pretty new and unexplored, because once someone figures something out, they can teach it to others.

So this model feels wrong from inside. I'm not constantly walking over already-known things, I learned the modern stuff in school and now everywhere I look is un-tread territory.

But that's a subfield. If you look at math as a whole, there was more big re-defining work in the 1800s than the last 100 years. So to me, things are getting shaken up and expanded all the time, but if you lack the expertise to understand individual subfields, it's hard to explain what new things are happening.

It's unclear to me if the laymen are too far away to see the truth, or the experts are too close to see the truth.

It's often said, and seems objectively true, that far more math has been invented in the last 50 years than the rest of human history put together. One reason for that is there are massively more mathematicians in the world now, for many reasons. So something is happening, the question is how to value it and how to value the people who do it.

Expand full comment

I don't recall hearing about Alan Turing and John Nash when I was young, but today, Turing and Nash are the heroes of Academy Award nominated movies. So, fame progresses. There are probably brilliant people born in the 1990s who are little known to the general public today, but who will be folk heroes in the second half of this century.

Expand full comment

I think some of it is the distance between novel science and practical application, which has a large time lag and a large component of luck. Maths in particular seems to have a habit of going from obscure trivia to pivotal bedrock hundreds of years after discovery, which of course is good for the fame of long-dead mathematicians and terrible for the fame of currently active ones.

Expand full comment

E.g., in 1938, Claude Shannon pointed out that George Boole's binary algebra would work ideally in electronic thinking machines.

Expand full comment

Scott addresses this in number 5.

Expand full comment

Came here to recommend Ars Longa, saw you already did it.

Expand full comment

Plus, some of the early geniuses may well be “names associated with” rather than “sole inventor of.”

For example, despite improvements in the history of science, I bet there were still some husband and wife teams where only his name is remembered (at least in popular culture).

Or Darwin: clearly his ideas grew out of the shoulders of the giants upon whom he was standing, that’s why other people were able to come up with them as well. But we don’t remember the names of those other guys. Similarly for Newton/Leibniz: sometimes the genius halo grows more out of our desire to have a single historical hook on which to hang our story of scientific advances, rather than a deep understanding of the science process.

And if our perception of past genius is distorted by the lens of history, then our comparisons with current geniuses will be less accurate.

Expand full comment

"If I have seen further it is because I have crawled to the top of this mound of dead dwarves" doesn't have quite the same ring to it.

Expand full comment

Peter Higgs, of Higgs boson fame, was one of about six co-authors.

Expand full comment

And the mechanism in question was actually discovered by Anderson several years before Higgs. Higgs himself credited Anderson for the mechanism. But (a) Anderson already had a Nobel prize and (b) he had made himself persona non grata among high energy physicists over the Superconducting Supercollider affair, and no way they were going to let `their' prize go to Anderson...

Expand full comment

of which, IIRC 3 shared in the prize? at least one was dead by the time it was awarded, which is not that surprising when there's 60-odd years between the prediction and the experimental confirmation.

Expand full comment

This model seems a bit oversimplified in two important ways.

1. Ideas don't really "deplete" like this. Say you come up with some good ideas around prime numbers and factoring. Someone else invents the computer. A third person puts them together and gets RSA. All three of those are good, valuable pieces of work, but I wouldn't think the third idea was "further out" than the first (in terms of how long it would take to get there). It was just gated on the computer.
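A toy sketch of that combination (textbook RSA numbers, purely illustrative and obviously not secure): the number-theoretic idea is old, but actually using it depends on a machine that can do modular exponentiation quickly.

```python
# Toy RSA with the classic textbook parameters. Requires Python 3.8+ for
# pow(e, -1, phi) (modular inverse). Tiny numbers chosen for readability.
p, q = 61, 53                  # two primes
n = p * q                      # 3233, the public modulus
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent: 2753

msg = 42
cipher = pow(msg, e, n)        # encrypt: msg^e mod n
plain = pow(cipher, d, n)      # decrypt: cipher^d mod n
assert plain == msg
```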

Lots of ideas are like this -- simple, but dormant until the other necessary ingredients are ready.

2. The campsite "moves" over time. A whole lot of our cognitive technology is encoded deep in our language, tools, norms, etc., and isn't fixed year over year. Even if today's people and yesterday's people could travel the same distance on average, today's people would still be biased to discovering new things -- just by virtue of starting off somewhere else.

Some of this technology is more literal: computers are something like a bicycle in this metaphor. The early astronomers were analyzing data by hand!

Expand full comment

Point 2 is pretty important. Tools develop. But not all of these are linear: needs develop, too, changing what tools we _think of_.

Analogy, we tame pack animals and horses, and carry more and further. Eventually we develop agriculture. We grow big trees, but now to get more fruit, we need to go higher... going further is not an option.

So someone develops a ladder. But someone could have developed a ladder at any time, if they had spent time solving the problem of climbing higher rather than solving the problem of walking further and carrying more. Differing needs can give us insight into spaces that were always there.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

Machine learning is definitely still one of the low hanging fruit areas. In this case, you can turn a drug discovery ML system into one that can discover VX nerve gas and a whole new, exciting range of chemical weapons just by inverting the utility function....

https://www.nature.com/articles/s42256-022-00465-9

Expand full comment

Worth noting that the objective function for a machine learning algorithm that finds nerve gas is not in any sense the inverse of one that finds a drug. Nerve gases ARE drugs, from the perspective of a machine learning algorithm. Just drugs with specific endpoints, a high premium on low-dose requirement, and a different set of example drugs to work from.
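A hedged sketch of that point (the scoring function, weights, and property names below are invented for illustration, not anything from the actual paper): most of a generative-design objective can stay the same, with only the toxicity term and the example set changing.

```python
# Hypothetical composite scoring function for candidate molecules.
# All property names and weights are made up for illustration.
def score(candidate, reward_toxicity=False):
    potency   = candidate["target_binding"]      # predicted activity, in [0, 1]
    toxicity  = candidate["predicted_toxicity"]  # predicted toxicity, in [0, 1]
    synthesis = candidate["ease_of_synthesis"]   # synthetic accessibility, in [0, 1]

    # Only this term's sense flips; the rest of the objective is unchanged.
    tox_term = toxicity if reward_toxicity else (1.0 - toxicity)
    return 0.5 * potency + 0.3 * tox_term + 0.2 * synthesis

candidate = {"target_binding": 0.9, "predicted_toxicity": 0.1, "ease_of_synthesis": 0.7}
print(score(candidate))                        # scored as a therapeutic candidate
print(score(candidate, reward_toxicity=True))  # same molecule, rescored as an agent
```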

Expand full comment

(Forgive me if this point has been made already, I'm writing this comment quickly)

I've been thinking about this a bit recently because I'm trying to write a piece about a related topic (and I listened to this interesting BBC documentary https://www.bbc.co.uk/programmes/m0015v9g on the slowing of science and progress). There's another mechanism which you don't model here: in the foraging model, finding fruit only makes it harder to find new fruit. But in science and tech, a discovery or invention makes future discoveries or inventions easier.

For instance, a wood-handled flint axe is a combination of two earlier inventions, the stick and the hand-axe. Newton's observations about gravity are possible because of the earlier invention of the telescope. The invention of the iPhone 13 is possible because of the earlier invention of the [various things, transistors, touch screens, etc].

So there's a countervailing force: individual discoveries become harder *given a base of zero knowledge*, but there are also new discoveries that become possible because they are simply combinations of earlier discoveries (or new technologies make them more accessible).

In your model it might be more like you're loggers, rather than foragers, and cutting down some trees allows access to new trees, but somewhat further off? I don't know what the equivalent of height might be, but perhaps strength.

Expand full comment

I think you don't distinguish inventions (technology) from discoveries (science) enough. The examples you give are mostly technology, where indeed depletion is less evident, because the fact that new inventions must be more complex (the simpler ones are done) is compensated by the fact that new inventions are often a combination of existing elements, and progress makes more/new elements available to combine.

For discoveries, it's not exactly the same. Progress is usually a way to make new observations available, but that's only part of scientific discovery: it helps discriminate between competing theories, or shows that an existing theory is not adequate and maybe hints at possible evolutions/replacements. Progress is like having new ways to reach the frontier faster, but ways which you also have to learn: having a car will surely make you faster, but you also spend time learning to drive.

So I think there is indeed a low-hanging fruit/foraging-ground exhaustion effect, especially visible in non-combinatorial fields, like base tech (energy production, for example) or fundamental science (physics is a prime example).

Expand full comment

oh I definitely think the low-hanging fruit phenomenon *exists*. But I think there's a countervailing force, of previous discoveries/inventions making new ones possible. I didn't distinguish between the two very much because I don't think the distinction is hugely important - Maxwell can't come up with the equations describing electromagnetism without [consults Wikipedia] Ampere's work on electrodynamics and a hundred other people. (Newton's "if I have seen further it is by standing on the shoulders of giants" quote seems relevant here.)

The empirical question I guess is *how much* the "shoulder of giants" effect counteracts the "low-hanging fruit" effect. My instinct is that it should still get harder, but I know that some people (dunno how to add inline links so apologies for the massive URL: https://deliverypdf.ssrn.com/delivery.php?ID=169000029071117028010108118107125117021004027015095011118103084127066112069079092092012120100121011044030127112082120083080119104074001083064089014029010122030115064002035016104004023010080092021005026090000093022067071027028110002121092079095127106123&EXT=pdf&INDEX=TRUE) think that the apparent slowdown is more to do with bad incentives in science, bureaucratisation, a need to progress in institutions rather than do good science, etc.

Expand full comment

There's a theory that new science is apt to be generated by new tools, rather than thinking more about data you've already got.

Expand full comment

I think you're right, Tom. I've described a similar model based on mining outwards, where progress exposes new rock here: https://www.lesswrong.com/posts/WKGKANcpGMi7E2dSp/science-is-mining-not-foraging

Expand full comment

> Let’s add intelligence to this model. Imagine there are fruit trees scattered around, and especially tall people can pick fruits that shorter people can’t reach. If you are the first person ever to be seven feet tall, then even if the usual foraging horizon is very far from camp, you can forage very close to camp, picking the seven-foot-high-up fruits that no previous forager could get. So there are actually many different horizons: a distant horizon for ordinary-height people, a nearer horizon for tallish people, and a horizon so close as to be almost irrelevant for giants.

Doesn't help that there used to be [a tribe with lots of seven-foot-tall people](https://slatestarcodex.com/2017/05/26/the-atomic-bomb-considered-as-hungarian-high-school-science-fair-project/) but [it has since been mostly exterminated](https://en.m.wikipedia.org/wiki/The_Holocaust).

Expand full comment

I have a notion that it's easier to think deeply when you're sure your neighbors won't kill you.

Expand full comment

Well, there's the famous observation that war and strife created da Vinci and Michelangelo, while hundreds of years of peace in Switzerland could only create the cuckoo clock.

Expand full comment

It's from a movie, and I'm not sure it generalizes.

I know a mathematician who believes that military training keeps people from becoming mathematicians, but I'm not sure whether there's solid evidence.

How much personal risk were da Vinci and Michelangelo at? Does it matter whether it's your neighbors or an enemy force?

Expand full comment

That's an interesting viewpoint, and I realize I hadn't analyzed my own viewpoint before.

1. Hunter-gatherers were surely always at war and at great personal risk. They didn't have any remarkable scientific progress for millennia. This supports your point.

2. However, the Second World War can be said to be directly responsible for the advent of the computer, atomic bomb, radar, etc.

Is it possible that if hunter-gatherers only had wars every 5 years or so, and not every week, they would put in a lot of resources into developing new weapons, thus heralding the technological revolution at an earlier date? Is it also possible that if the Second World War had been much shorter, say 1 year instead of 6, a lot of the present-day technology would never have been developed? It seems to me that for technological progress, we need urgency in the form of war, but also slack in which we can play around with crazy and unknown ideas to develop the best inventions/weapons.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

There's also the question of *refinement*; you can be a Great Genius of your period, but if the technology isn't up to it, then what data you can gather and what experiments you can perform and what working devices you can create are limited.

The Michelson-Morley experiment was important because it disproved the theory of the ether and in turn kicked off research that would eventually develop into special relativity. But there hadn't been sufficiently precise instrumentation to do such an experiment before then; same with measuring the speed of light, etc.

Your hunter-gatherers can be smart and innovative, but there is only so much they can do with their first tools, which have to be worked on to produce better tools, so that more ore can be mined and smelted, and the smelting process itself improved, until eventually you can manufacture steel - and then you're going places with what you can make, how precise it can be, and how flexible and useful it is.

The purported lack of genius today may be down in part to something as simple as "we're still working with bronze implements, we haven't even got to steel yet".

Expand full comment

There may be a significant emotional difference between being attacked by enemies and genocide by your own (or nearly your own) government. In war, you have enemies. In a genocide, you can't trust your neighbors.

One thing that needs to be explained is that, when Nazi Germany fell, Jews still existed. A lot of the geniuses from the 40s still existed. Maybe it was a specific sort of schooling that went away-- institutions can be smashed.

It's possible that it's not that the level of accomplishment has dropped; it's just that the publicity machine for anointing geniuses isn't working as well.

Or it's possible that there's a level of trauma which needs to fade.

On the art side, I suspect that a lot of creativity is going into gaming. Someone could be an amazing dungeon master, but their art is personal and ephemeral and they aren't going to be picked out as a genius.

Video editing is a new art form, and it can be popular, but no one takes it seriously the way older art forms are taken.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

Leonardo did work as a military engineer and architect for several patrons. We think of him mostly as an artist, but he would have been expected to turn his hand to anything - and was very much capable of doing so. From Wikipedia:

"Leonardo went to offer his services to Duke of Milan Ludovico Sforza. Leonardo wrote Sforza a letter which described the diverse things that he could achieve in the fields of engineering and weapon design, and mentioned that he could paint. ...When Ludovico Sforza was overthrown by France in 1500, Leonardo fled Milan for Venice, accompanied by his assistant Salaì and friend, the mathematician Luca Pacioli. In Venice, Leonardo was employed as a military architect and engineer, devising methods to defend the city from naval attack. In Cesena in 1502, Leonardo entered the service of Cesare Borgia, the son of Pope Alexander VI, acting as a military architect and engineer and travelling throughout Italy with his patron. Leonardo created a map of Cesare Borgia's stronghold, a town plan of Imola in order to win his patronage. Upon seeing it, Cesare hired Leonardo as his chief military engineer and architect. ...In 1512, Leonardo was working on plans for an equestrian monument for Gian Giacomo Trivulzio, but this was prevented by an invasion of a confederation of Swiss, Spanish and Venetian forces, which drove the French from Milan. Leonardo stayed in the city, spending several months in 1513 at the Medici's Vaprio d'Adda villa."

So there was an amount of risk in his life.

Expand full comment

Orson Welles' famous comment on the Swiss could be quantitatively tested.

My impression is that the Swiss are reasonably accomplished, but that, Switzerland lacking huge cities, they typically reached their peaks in other countries. E.g., Rousseau became the most famous Parisian intellectual of the second half of the 18th Century, Einstein created the theory of general relativity in Berlin, and Euler and some of the Bernoullis spent years in St. Petersburg.

It's a little like West Virginia: West Virginia is lacking in celebrities, but California was full of West Virginian heroes like Chuck Yeager and Jerry West.

Expand full comment

Whistler and Welles were talking about art. You may hate the art of the cuckoo clock, but you must respect its engineering.

Expand full comment

Another interesting read, thanks! :)

On a single small point : "Since a rational forager would never choose the latter, I assume there’s some law that governs how depleted terrain would be in this scenario, which I’m violating. I can’t immediately figure out how to calculate it, so let’s just assume some foragers aren’t rational".

Isn't there a question of personal preferences and aptitude? Sure, it'd be more productive to go over there, but I happen to really like it here, and foraging that particular ground makes me feel competent, while going over there is arduous for me.

Hence even if it would be more 'rational', I'm not going to do it. 'Irrational' is an acceptable descriptor for that behaviour in economics, but it may not be quite 'irrational' in everyday parlance; it's just optimizing for different objectives.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

Let me give an epistemic reason for the stall. There’s a clear barrier to recent progress of the traditional kind, which is (to use the jargon of my colleagues in Santa Fe Institute) complexity.

Complex systems are not amenable to the Francis Bacon style “vary and test” experimental method. We’re learning a huge amount but returns to experimental methods of the causal-control kind are hard to come by. Taleb is a good example of a person — a real, no-BS practitioner — who recognized many of the same things the SFI people did. In a funny way, so was David Graeber.

Examples of complex systems include the human mind and body; hence why we’ve had so little progress in getting control of (say) depression or cancer, and why the human genome project fizzled after we found the 1-SNP diseases. Much of Econ is similar (IMO the RCT era is overblown). ML is CS discovering the same.

They’re hard problems that will require a new set of tools, and even a new Francis Bacon. The good news is that I think we will crack them. We stumbled on this world in the mid-1980s, but IMO didn’t get serious until the mid-2000s.

Expand full comment

Agree with everything you say here for whatever it’s worth. We need a new kind of flashlight to see ahead and specifically a flashlight for complex systems.

Expand full comment

Didn't investigation of complex systems at least get started when computers reached the point of being able to somewhat handle chaos?

Expand full comment

What about education? Think the day away/teleport thing might break on this one. The tribe writes down descriptions of very distant places in very exacting detail and if a student spends ten years studying it they can get there instantly vs say two hundred years if they tried to go it alone. Or do we define the day as what is possible to achieve even with education?

Another interesting thought is artifice. One day the tribe invents a car. I mean that literally in this analogy, although maybe a microscope is better - or stilts, or a shovel, or something. The mere addition of tools that allow you to reach greater depths or heights causes the depleted land to have new bounty. Some of those technologies exist farther away.

I like this a lot overall. I have a similar analogy about lighthouses that I use.

Expand full comment

Thanks for the great write-up.

In some sense, a lot of progress in science can be thought of as "getting closer to the truth" rather than "finding new terrain". "Getting closer to the truth" comes from a "change of perspective". This change of perspective mostly comes from new technology or observations, like the Michelson-Morley experiment, which gave rise to Relativity, or other experiments that led to Quantum Physics. The age of the scientists is generally irrelevant. Physics was many hundreds of years old when Einstein and Dirac, young scientists, made their discoveries. Although they may in themselves be giants, it is difficult to argue that such giants don't exist at all today in terms of sheer intellect and hard work.

Hence, I feel that point no 5 and confirmation bias can explain a lot of this. People learn a paradigm, and try to stick very hard to it, until new technology makes experiments possible that clearly contradict those paradigms, causing paradigms to change. The first scientists to then discover those changed paradigms that accommodate the new experimental results become heroes.

Expand full comment

Let's revisit this issue when you've got more data about the scientists, so that you can concentrate on that instead of elaborating the already-clear forager metaphor and then shrugging your shoulders over the real question.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

This is really pessimistic without the last part - that at some point, the foragers manage to set up camp in another part of the forest, acquiring untold riches at first, then letting others set up even further.

This is what happened with machine learning, with biotech (next generation sequencing, anyone?), in fact a lot of science is about this kind of camp-setting. "Standing on the shoulders of giants", and it's giants all the way down/up.

There is a huge difference between having to figure out calculus from first principles, and learning it in high school then moving on to something cooler. And then you can have the computer calculate your integrals for you, with calculus relegated to your "maybe figure it out someday when it's needed" pile. Knowledge is a tool for acquiring further knowledge.

Expand full comment

As I say in the original essay on genius, I think it's true that "ideas are getting harder to find" (what you call the "Low-Hanging Fruit Argument"). It's also empirically supported by looking closely at things like agricultural yields. The question is just whether it fully explains the effect, or even most of the effect, and there are reasons to doubt that. For example, the two reasons I give in the original essay to be skeptical are:

(a) if the lack of genius (or let's just say "new ideas") is due solely to ideas getting harder to find, then it is an incredible coincidence that, as the effective population of the people who could find such ideas exploded to essentially the entire globe (with the advent of the internet and mass education), ideas got harder to find to the exact same degree. In fact, this looks to be impossible, for there should have been a "mining" of the idea space that quickly exhausted it, and which would have triggered a cultural golden age. It is on this question that the original essay starts, but I've never seen anyone address how changes in effective population should have led to more "finding", and that doesn't look like what we see.

(b) “ideas are getting harder to find” seems especially unconvincing outside the hard sciences in domains like music or fiction. I actually still think there is some truth to it - you can only invent the fantasy genre once, and Tolkien gets most of that credit. But overall it seems obviously true that something like fictional stories aren't as directly "mineable" as thermodynamical equations. And yet, again, we see the same decline in both at the same times, so the explanation needs to extend beyond the hard sciences.

Expand full comment

How much of the entire globe has supporting infrastructure (physical and cultural) that makes these places a viable location for research? If we moved a researcher from Switzerland to Kenya or Bangladesh, how would it affect their output?

Expand full comment

Surely it might, but perhaps less than one might expect - there are some great universities in India! But consider just within the US: it used to be that only the kids at MIT got the MIT lectures. Now *anyone* can get the MIT lectures. It used to be that only the kids at Harvard got access to the best math teachers. Now there's Khan academy. Most scientific papers can be found online, even if you don't have institutional access. There's thriving intellectual communities on blogs and in forums and places to post contributions at zero-cost, if you have them to make. Not to mention the mass education - just look at how many new foragers there should be now that racial and gender barriers have been significantly decreased and college is basically mandatory for a huge swath of Americans! An explosion of foragers, and the ease of foraging massively increased, all within a few decades. And yet, and yet.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

Khan and MIT lectures are great fallback resources, or adequate resources to get an understanding somewhere between that of a high school student and a really bad undergrad.

I tried learning molecular biology as an amateur before actually returning to college for it (perks of the free, if substandard, education in my country). Maybe I could pull it off if I was a supergenius, but realistically I'd have an extremely fragmented and ungrounded understanding. So for my first contribution to the field, I'd need to:

- read papers en masse where the only layman-accessible words are prepositions

- understand the context of those papers to know why they're doing what they're doing, and why it's important for the field, and why couldn't it be done in another easier way

- identify limits of the state of the art

- come up with a good idea for a novel contribution, by myself, without asking people who already grapple with these problems

- find collaborators and convince them my idea is good and I know my shit

- write the paper

- get it accepted in a journal, or at least self-publish on biorxiv and make a considerable number of domain experts read it

God forbid I need actual wet experiments and funding - that's just the list for pure in silico work!

At any point I can make mistakes - I don't have any experts to look over my shoulder and tell me when I made a mistake. I'd spend years of my life working on a project that is completely irrelevant and either someone did it better five years ago, or it's just something that nobody does because there is no point to it.

Expand full comment
Apr 2, 2022·edited Apr 2, 2022

>- get it accepted in a journal, or at least self-publish on biorxiv and make a considerable number of domain experts read it

Isn't this exactly the problem caused by the greatly increased number of people involved in science? Too many people publishing so much that nobody can read it all and judge it on its merits. Consequently everyone falls back into social games of knowing the right tastemakers and having access to the right connections. Instead of a democratic/capitalist marketplace of ideas, it becomes an aristocratic society reminiscent of an early Victorian novel. ("Aristocratic" intended as a slur, not its literal meaning.)

A century ago a scientist could have a very good grasp of their field by reading a handful of journals relevant to their discipline. Two centuries ago, you'd do well by reading a single journal (Philosophical Transactions).

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

"An explosion of foragers, and the ease of foraging massively increased, all within a few decades. And yet, and yet."

Okay, I'm gonna bite here. Take one of your geniuses of the past, plonk him down today, and see if he still counts as a genius. Strip away the mythos of Einstein, take his work from youth to maturity, and compare it with people working in the same field today.

Would Mozart be considered a genius? Maybe. Or maybe he would go into composing movie scores which is lucrative and steady work, and nobody would talk about him as a great genius, even if he radically transformed how movie scores are written and used.

The mark of the past genius should be that they could still come up with novel concepts and discoveries even after absorbing all the progress made in the field since their day. But could they? Would Einstein be able to leap ahead with new ideas, or would he have reached his limit at what are now "yes, this is part of the field everyone has to know" ideas?

I do think there are natural human limits. It may be that we are hitting up against them, and that the intellect sufficient to be a revolutionary genius in the low-hanging fruit days is not sufficient to be the same in days of scarcity of fruit. It could well be that in ten years time somebody comes up with something new and unprecedented which is indeed unforaged territory, and the corresponding Great Men will flourish in that field. Giving up on "where are all the geniuses?" right now seems premature to me.

Expand full comment

"there should have been "mining" of the idea space in order to quickly exhaust it, and which would have triggered a cultural golden age."

What makes you think this is not a cultural golden age? How would you define a cultural golden age? As we are constantly reminded, no other society in the history of the planet has been as rich, as educated, or has lived as well as we do. We have everyday devices that were the stuff of science fiction even thirty years ago, and our poor can access some model of those. We have the Internet; we have access to more information, more quickly, easily and cheaply, than any culture before has ever had. Diseases and health problems that would have been death sentences are now curable with a course of pills or surgeries. Ordinary people have access to resources to be creative that even the great craftsmen and artists of former times could not have dreamed of.

The complaints seem to be along the lines of "where are our colonies on Alpha Centauri?" which, when you think about it, can only be the kind of complaints from a rich, successful, technologically advanced society accustomed to constant progress.

(I'm not saying we are in a cultural golden age, just that golden ages tend to be recognised by looking back to the past and saying 'ah yes, that was a time of wonders'. What will our descendants a hundred years from now think - will they talk about our golden age?)

Expand full comment

I don't think most people credit Tolkien with "inventing" the fantasy genre since most everyone knows it existed before him. It's just that nearly everyone since has been writing in his shadow.

Expand full comment

So far as I know, Tolkien didn't invent the fantasy genre. He invented, or at least popularized, serious world-building, which has become a dominant part of fantasy.

Expand full comment

"you can only invent the fantasy genre once, and Tolkien gets most of that credit"

William Morris would like a quick word:

https://en.wikipedia.org/wiki/The_Well_at_the_World%27s_End

Expand full comment

It depends on what you mean by "the fantasy genre". I believe that fantasy is a human norm for fiction, and if anything was invented, it's the idea that respectable adults shouldn't want fantasy.

And then.... there was Tolkien and Star Wars and the return of the repressed desire for fantasy.

Expand full comment

The low-hanging fruit argument seems to me very probable, and I love the metaphor with real fruit in it!

I would like to add a small (and quite optimistic!) additional hypothesis concerning the decrease in the observed frequency of geniuses, this one relating to the increase of the population and its level of education.

If we assume that we recognize someone as a genius when he or she clearly surpasses all the other people in his or her field, then it is mainly a relative evaluation, made by comparison with what other people produce at a given moment in the field in question. In that case, the fact that the population, as well as its level of education, is increasing must also very significantly increase the number of people working in any given field. And then, it seems to me that, statistically, the probability that the most talented person in a field is much more talented than the second most talented person in the same field is probably much lower than before.

Therefore, we would have difficulty recognizing contemporary geniuses partly because there would be many people doing extraordinary things in general, whereas before there were a few who stood out.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

I like the rough model, but I'd point out that there are certain topological assumptions being made, which maybe don't apply. If 'places of insight' were arranged in some Euclidean geometry, then your theory holds.

But if we generalize it to "finding new knowledge requires _either_ walking new ground, or exceptional talent" (which I think is totally fair), we might ask whether it's possible to walk new ground via nontraditional approaches. If the _only_ dimension we consider is 'angle and distance from the base camp', i.e. the territory is a 2-D Euclidean grid, and we've mapped out where everyone has and hasn't walked, then it becomes much less likely you will _find_ new ground immediately around the camp.

But if the number of dimensions is so high that most people don't even _see_ a bunch of dimensions, then we might actually expect _creativity_ to lead to insights more readily than intelligence.

Or, if technology+economics have changed in such a way that someone might have 10 different mini-careers and still acquire sufficient wealth to do as they please, this might _also_ be 'new territory' where discoveries become easy. So we might expect future discoveries to be more likely from, say, a startup employee turned venture capitalist turned amateur horticulturalist turned poet turned botanist, who synthesized a bunch of experiences that many other people had had _individually_, and yet nobody had yet had _collectively_.

The fruit-gathering analogy might work if someone is the first person to circumnavigate the camp at a specific radius, and to spend at least a few weeks at different angles at different times of the year. They might notice some seasonal continuity between plants growing only at that radius, which might only be observable to someone who had spent the right amount of time in all of those places. In terms of ground, they haven't covered anything new. But if we include time in there, then yes, it's like they _did_ walk on new territory.

So I like the theory if we generalize it as "to maximize your chance of discoveries you have to walk on ground nobody else has walked on before", but it's worth asking whether "the space of being an academic researcher" being extremely well-trodden means that there aren't low-hanging fruit in dimensions none of us have even considered looking in.

Like, for all we know, just breathing weird for like 7 years straight could let you levitate and walk through walls. How would we know if this were true? Suppose someone discovered it 10,000 years ago, and they did it, and everyone was like 'holy shit that's crazy' and they wrote stories about it, and today we dismiss those because they are obviously absurd. Are _you_ willing to spend seven years chanting some mantra on the off chance that maybe it'll let you walk through walls? I'm not. Probably most reasonable people aren't. That's some unexplored territory right there! But something tells me it probably isn't worth the effort.

And yet people like Wim Hof exist. This tells me there's probably a ton of low-hanging fruit still around, but it'll be discovered by eccentric weirdos.

Expand full comment

Is there any reason to assume science fruit space is even Euclidean?

As players of https://zenorogue.itch.io/hyperrogue know, in hyperbolic space you can fit huge worlds so close that you'll get from anywhere to anywhere else in a few steps. The problem is just knowing the way.
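To put a rough number on "huge worlds so close" (my own addition, using the standard area formula for the hyperbolic plane of curvature -1, which is roughly what HyperRogue simulates): the area within radius R grows exponentially rather than quadratically,

```latex
A_{\mathrm{hyp}}(R) = 2\pi\,(\cosh R - 1) \approx \pi e^{R} \quad (R \gg 1), \qquad A_{\mathrm{euc}}(R) = \pi R^{2}
```

so if idea-space were anything like this, an enormous amount of unforaged ground could sit within a short walk of camp, and the binding constraint really would be knowing which way to head.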

Expand full comment

I'm not sure about "taking more time to reach the frontiers of knowledge". Bachelor's degrees haven't gotten steadily longer over time, and previous key discoveries get built into the curriculum. The length of postdocs has (particularly for those eyeing an academic career), but that has more to do with the often enormous quantity of work required to get your Nature Something "golden ticket" paper. Once you start grad school you're basically teleported to the frontier. People learn and adapt quickly.

I think genuine breakthroughs happen on a more regular basis than people think, but we've pushed the depths of knowledge so deep that they're not necessarily recognizable to an outside observer.

Expand full comment

I’m not sure about other fields, but I can say that it definitely takes another 2-3 years to hit the frontiers after a BA. This might vary depending on your subfield, but in analysis it definitely can

Expand full comment

I really enjoy analogies, so thank you for writing up this very thoughtful and entertaining model. I think there's another thing at play, which is distraction. I'm not as talented a writer as you, so instead of clumsily trying to extend the analogy, I'll tell some stories about my own medical school class.

I went to a well-regarded medical school with lots of brilliant and talented classmates. I will say there was a big difference in how much each of my classmates was excited by discovery, and interest in discovery was largely orthogonal to pure intellectual horsepower. Some of the smartest people I've ever met had exactly zero interest in research--they jumped through the appropriate hoops to get residencies and fellowships in lucrative fields and now enjoy lives where they are affluent, enjoy high social status, and do work that they find interesting enough. I think some of these folks are "potential geniuses" who made a rational choice to take a sure thing (a career as an orthopedic surgeon) over something more volatile (a career doing research in the life sciences).

To give an example of the same effect, working slightly differently, a friend of mine told me that he had taken a job as an investment banker right after college, and then was laid off before he could start working due to the financial crisis. He came to medical school as a backup plan, and is now an extremely talented epidemiologist.

My final story is about a friend who, while he was a post-doc (MD PhD), realized it made much more sense to moonlight as a doctor and pay other post-docs (who were PhDs and didn't have the more lucrative option of taking care of patients) to execute his experiments for him. This was kind of a boot-strappy way of leveraging the resources around him. But I tell this story because he had to make science more of a passion project funded by his actual, lucrative career as a physician.

What I take away from these stories is three things:

1. It doesn't really make a lot of sense to study the sciences (especially at a fundamental, basic level that is most likely to create groundbreaking discoveries) if what you care most about is a comfortable or happy life. True, the rewards are enormous for the right-most outliers, but most people work very hard for tiny material rewards, when they're usually clever enough that they could have nicer lives in most other careers.

2. Having a successful career as a scientist is HIGHLY path dependent. You have to have the right sequence of experiences that give you more and more momentum, while also not having experiences that pull you off your scientist path onto more lucrative or comfortable paths. This is a point that's been made MANY times before, but I wonder how many potentially great thinkers over the last 30 years have pursued careers in management consulting, banking, dermatology, or orthopedic surgery. Obviously these people still make potentially great contributions to society in the roles that they take, but they are much less likely to expand the frontiers of human knowledge.

3. We still probably undervalue most research, as a society. Because the potential payoff is so uncertain, individuals have to bear a lot of the risk of these careers. There's an enormous opportunity cost to getting a PhD and doing a post doc, and even if you are one of the few successes that gets your own lab, it's still not a very materially rewarding situation. So what you end up with is a) a lot of talented people who bail on their science careers for things that are more of a sure thing and b) a lot of people who never consider a science career because it represents a high-risk, low-reward scenario compared with the other options in front of them.

Expand full comment

For the sake of argument, let's grant that your argument as presented is 100% correct. Even so, outsized focus on the political aspect is right and proper because unlike the mechanical causes we have some small hope of changing the politics. Instead of "there's no mystery to explain here" the takeaway could be "we need to run a tighter ship of Science, the deck's stacked against us".

Expand full comment

Perhaps a more apt analogy for science is not picking fruit, but planting fruit trees. Planting a fruit tree suggests a scarce return in the short term, but the returns can expand organically in two ways: as the tree grows, and as the seeds from the tree spread to sprout other trees. So, a single planted tree has the potential to spawn an entire ecosystem. Similarly, knowledge begets knowledge.

Expand full comment

"machine learning should have a lower age of great discoveries."

Possibly controversial opinion, but machine learning is a technological field, and not a scientific one... or rather -- none of the science is novel. The advances in machine learning are a combination of the scale afforded by modern hardware, vast amounts of data, and statistical and curve-fitting theories that have been around forever. The big issue with regarding it as a scientific field (for me) is that they aren't coming up with new principles as such; they're coming up with a set of techniques to accomplish tasks. And in general they have no idea how these techniques actually accomplish these tasks -- the loop is generally suck-it-and-see; hence all the pseudoscience surrounding it and baseless claims that brains work like neural nets, or that sexual reproduction works like dropout, and so on.

Another factor is that to make a discovery in machine learning, you need to spend a lot of money on compute, and a lot of money on data (or have an agreement with some company that already has tonnes of it) -- so this also favours established people.

Finally, advances in machine learning are consistently overstated. GPT-3 already absorbs more content than any human has ever absorbed; and people are amazed that it can muddle through tasks that are simple for a human child with a fraction of the compute, or training data. Also, there's a bit of Emperor's Clothes about this stuff. One of the useful things about human cognition is that you can tell a human "hey, there's this interesting thing X" and the human can quickly assimilate that into their model and use it. For example, I can give you a slightly better method for multiplying numbers, and you can apply it pretty instantly. This is what "learning" usually means for human cognition. You can't explain to GPT-3 a better method of multiplying numbers. And there's no mechanisms on the drawing board for how to do it. Sorry this is a bit of a rant, but in my real life I'm surrounded by people who think GPT-3 is basically a human brain and it drives me nuts.

I think you need a different model for science and technology? As you say, physics seems to have stagnated, but our technology continues to advance; cosmology continues to advance, but spacefaring technology regresses; scientific knowledge about birth advances, but birth outcomes in terms of morbidity and cost decline. For software and engineering, the science continues to advance, but the technology declines (see Collapse of Civilization by Jonathan Blow).

Expand full comment

The area available to forage is pi R squared.

I'm pondering if this is related to scientific discoveries too. Since geography varies, and science fields vary also, I think there's some merit here. One forager may specialize in muddy seeps, whilst another may focus on the banks of larger rivers, and another robs the nests of cliff-dwelling birds. Each would find different resources: one comes back with a fish, the other with cattail roots & duck eggs, another with swallow eggs and nestlings. Likewise in science, someone plays at melting things in the furnace, someone plays with light and lenses, another ponders infinite series.
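To make that first line quantitative (my own back-of-the-envelope addition, not the commenter's): if the camp's foraging range is R hours, the reachable area is πR², so each extra hour of range opens an ever-larger ring of fresh ground:

```latex
A(R) = \pi R^{2}, \qquad A(R + \Delta R) - A(R) = \pi\,(2R\,\Delta R + \Delta R^{2}) \approx 2\pi R\,\Delta R
```

So small improvements in how far foragers can travel buy disproportionately more new terrain - and, as the comment goes on to suggest, more kinds of terrain to specialize in.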

Expand full comment

"Some writers attribute the decline in amateur scientists to an increasingly credentialist establishment"

I suspect that one reason for the credentialist establishment is that it takes many years to reach the state of the art in knowledge, and non-rich people can't afford to spend that many years studying rather than working. The longer it takes to reach state of the art, the more money has to be spent getting that student to the state of the art, and the greater the need for a bureaucracy to decide who gets it and who doesn't - and bureaucracies run off credentials.

One reason I think that the UK is overrepresented in scientific research is that our education specialises earlier than most other countries, which means that, at the expense of a broader education, Brits can reach the state of the art several years earlier than Americans (the average age at PhD is 29 vs 33).

Expand full comment

If true, this is an excellent argument for letting gifted kids specialize earlier, while the whole educational community is pushing for a longer period of general education. There are other considerations here - maybe children with a more general education are more likely to lead happy lives and it's worth sacrificing a few potential geniuses to the gods of mediocrity to make that happen.

But if so that just brings us back to Hoel and the idea that an education that is personalized and one-on-one is just vastly superior to our system at cranking out revolutionary thinkers.

I guess if you find yourself burdened with precocious progeny, the strategy is get 'em young, find someone who can cultivate their strengths, and try to keep the truancy officer away long enough that they aren't forced to spend 6 hours a day proving they're reading books they already read.

Expand full comment

Mostly off-topic but fun and kind of instructive game: Imagine what a modern education looks like for geniuses of the past. What was Oscar Wilde's mandatory elective? What does a paper about the Themes of the Scarlet Letter with an introduction paragraph, at least three body paragraphs, and a conclusion paragraph look like if written by Newton or Einstein?

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

"What was Oscar Wilde's mandatory elective?"

Classics, if this Wikipedia article is correct (I read some anecdote years ago about one of his Trinity tutors saying, about Wilde's success in England, "Yes, it was better for Oscar to go there, he wasn't quite up to the mark here"):

"Until he was nine, Wilde was educated at home, where a French nursemaid and a German governess taught him their languages. He joined his brother Willie at Portora Royal School in Enniskillen, County Fermanagh, which he attended from 1864 to 1871....He excelled academically, particularly in the subject of Classics, in which he ranked fourth in the school in 1869. His aptitude for giving oral translations of Greek and Latin texts won him multiple prizes, including the Carpenter Prize for Greek Testament. He was one of only three students at Portora to win a Royal School scholarship to Trinity in 1871.

Wilde left Portora with a royal scholarship to read classics at Trinity College Dublin, from 1871 to 1874, sharing rooms with his older brother Willie Wilde. Trinity, one of the leading classical schools, placed him with scholars such as R. Y. Tyrell, Arthur Palmer, Edward Dowden and his tutor, Professor J. P. Mahaffy, who inspired his interest in Greek literature.

...At Trinity, Wilde established himself as an outstanding student: he came first in his class in his first year, won a scholarship by competitive examination in his second and, in his finals, won the Berkeley Gold Medal in Greek, the University's highest academic award. He was encouraged to compete for a demyship (a half-scholarship worth £95 (£9,000 today) per year) to Magdalen College, Oxford – which he won easily.

At Magdalen, he read Greats from 1874 to 1878, and from there he applied to join the Oxford Union, but failed to be elected.

While at Magdalen College, Wilde became particularly well known for his role in the aesthetic and decadent movements. He wore his hair long, openly scorned "manly" sports though he occasionally boxed, and he decorated his rooms with peacock feathers, lilies, sunflowers, blue china and other objets d'art. ...Wilde was once physically attacked by a group of four fellow students, and dealt with them single-handedly, surprising critics.

...In November 1878, he graduated with a double first in his B.A. of Classical Moderations and Literae Humaniores (Greats). Wilde wrote to a friend, "The dons are 'astonied' beyond words – the Bad Boy doing so well in the end!"

Expand full comment

Classics are a good choice for a writer/artist - My high school offered band, orchestra, drama, show choir, and home ec. I was not an exceptional student and it wouldn't have occurred to me to ask for a classics elective, but if some incredibly talented young person had, I suspect the folks there would have needed to look it up.

Expand full comment

"the strategy is get 'em young, find someone who can cultivate their strengths"

How young are we talking, and how specialised? Suppose little Johnny is good at maths, so you identify that as where he has the potential to excel. So you steer him along a path leading more and more to specialisation in maths, and prune away any extraneous subjects. And sure, he ends up excelling in maths - but there's an amazing breakthrough in biology he could have made, except all that was pruned away early in favour of keeping him on the maths track.

The Polgar sisters are chess prodigies, but could they have been doctors, musicians, engineers? We don't know and are unlikely to ever know, because while I don't think their father isolated them with chess alone, that was where the positive reinforcement came in. Doing well at something else was praised, but doing well in chess was where the most attention and most celebration and most reinforcement happened.

What way would they have turned out with a more general education? Would they have been prodigies in a different area, if left to natural inclinations? That's not a question we can answer, but I do think it needs to be asked when we're talking about steering kids to specialise in one topic over another.

Expand full comment

Entirely agree - I'm currently expecting and know I am not the sort of person who will be able to steer my own child into doing one thing their whole lives without giving them input. I *do* think this means that I will not raise a "person who gets to be in the history books" level genius, but most "person who gets to be in the history books" level geniuses I read about turn out to have miserable personal lives and feel bad about themselves forever. And an unusual number of them turn out to have serious issues with their fathers so...

Expand full comment

> I can’t immediately figure out how to calculate it, so let’s just assume some foragers aren’t rational.

I can't resist that remark. Nerd snipe successful. First, a short answer: after 9 hours of travel you only have half as much time left to forage as after 6 hours, so you should be able to gain twice as many points per time unit to compensate for that. So if the area at 6 hours distance is 50% depleted, the area at 9 hours walking distance should be 100% virgin area to get the same expected total value. Only after the depletion level in all areas within 9 hours walking distance increases does traveling further become worthwhile.

Compare with the early explorers: Nobody will travel when the area at distance 0 has full value; it is only after the depletion level at close distance starts to become noticeable that people will decide to venture out (and even then, they'll travel as little as possible if they want to maximize their gain).

More general computation: assuming we are in a state of equilibrium, let D(x) be the depletion level at x hours from camp. Then after walking x hours and gathering for 12 - x hours, you gain (12-x)*100*(1-D(x)) points. In a state of equilibrium, this should be constant, so (12-x)*(1-D(x)) is constant, say C. Then 1-D(x) = C/(12-x), i.e. D(x) = 1 - C/(12-x). Given the assumption that D(6) = 0.5 (you need to make an assumption somewhere), you find C = 3 and hence 1 - D(9) = 1. For travel beyond 9 hours the formula gives a negative D(x), i.e. the area would need an expected value of more than 100 points per hour to be worth traveling to. In a model where D(x) must be between 0 and 1, the depletion level gradually decreases as you travel further until you reach D(x) = 0, at which point exploring further gains you nothing. Note: D(x) = 0 when x = 12 - C.

You can measure C by checking the depletion level at any point where people do forage; for example, if the area at 0 hours distance is 95% depleted, then 1 - 0.95 = C/12, so C = 0.6, and people will travel as far as 12 - C = 11.4 hours to forage for 0.6 hours, gaining 0.6*100 = 60 points, the same as 12 hours at 5 points per hour back at camp. Chances are that far before this point it'll become valuable to invest in ways to travel further.
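A minimal sketch of this computation (my own illustration of the commenter's algebra, assuming the 12-hour day and 100-points-per-hour virgin ground from the post; all names are made up for the example):

```python
# Equilibrium depletion in the 12-hour foraging day described above.
# A forager walks x hours out, then gathers for the remaining 12 - x hours at
# 100 * (1 - D(x)) points per hour; in equilibrium every visited distance pays the same.

HOURS = 12   # length of the foraging day (assumed, as in the post's model)
RATE = 100   # points per hour on completely virgin ground

def depletion(x, c):
    """Equilibrium depletion D(x) = 1 - c / (HOURS - x), clipped to [0, 1]."""
    return min(1.0, max(0.0, 1.0 - c / (HOURS - x)))

def daily_points(x, c):
    """Points gained by walking x hours and foraging for the rest of the day."""
    return (HOURS - x) * RATE * (1.0 - depletion(x, c))

# Calibrate C from the commenter's assumption that ground 6 hours out is 50% depleted:
# 1 - D(6) = C / (12 - 6)  =>  C = 3.
C = (1.0 - 0.5) * (HOURS - 6)

for x in [0, 3, 6, 9]:
    print(f"x = {x}h: depletion = {depletion(x, C):.2f}, daily points = {daily_points(x, C):.0f}")

# Every visited distance yields 300 points/day; past x = 12 - C = 9 hours the
# formula would need negative depletion, so nobody travels further yet.
```

Re-running it with C = 0.6 reproduces the second scenario: a 95%-depleted camp and an 11.4-hour frontier.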

Expand full comment

Wouldn't the foragers just move camp? If it takes nine hours to get from present campsite to new virgin area, isn't it simpler to pack up and move the entire camp closer, rather than spend the majority of time travelling to and from the new site, with less time to forage?

Agriculture would be different, as you are rather tied to one area (you can't just pack up an entire farm). Even there, it's easier to drive out your animals to graze in a good area and then drive them back in the evening to be milked, and if it starts to take too much time to travel to and from the pastures, you do things like let the sheep roam free on the mountain and only bring them down at certain times of year (a bit tougher with cattle, but transhumance is definitely something: https://en.wikipedia.org/wiki/Transhumance).

I'm not entirely sure where I'm going with this, mostly that the analogy breaks down for me here. If your foraging grounds are further and further away, you can easily pack up and move closer. You can't quite do the same with fields of knowledge, because you do have to traverse the already-covered ground to give you the background you need before you can get to the new, untapped area.

Expand full comment

Oh, absolutely, in my last line I was mostly sticking to the existing analogy that I think Scott intended, in which I interpret the twelve hours roughly as the human lifespan, traveling as time spent learning the basics to be able to understand an unknown field of knowledge, and foraging as actually doing research in said field. In that case 'moving the camp' is (probably) not an option (we all start from birth), but finding ways to travel further might be possible (be it improved education (faster walking speed and/or more efficient foraging), intelligence enhancement (same), lifespan extension (more hours to travel), or [insert favorite way to increase the amount you can learn]).

Sticking to the actual foraging story, I agree that moving camp would in most cases be more sensible than investing in maximum travel distance (especially if you actually manage to reach 95% depletion of the nearby area).

Expand full comment

> you can easily pack up and move closer. You can't quite do the same with fields of knowledge, because you do have to traverse the already-covered ground to give you the background you need

The verb in the metaphor should be “assimilate”

not “consume”

perhaps.

Expand full comment

I don't think this adequately explains why people who made great discoveries when they were young in 1900 didn't increase their rate of making great discoveries when they were old and bring the average up. One needs to explain 1900-people losing the ability to discover as they age, but 2000-people gaining it.

One unmentioned thing can help explain this: extension of healthspan. The mind is the brain, and the brain is just an organ in the body; if the body is generally dysfunctional, the brain will probably not be in the best condition either. Being in great health instead of poor health probably at least decuples the probability of some great discovery. The age-related cognitive decline curve has probably shifted a lot due to the extension of healthspan.

Expand full comment

Plus, Treponema pallidum has less chance to wreck the brains of geniuses nowadays.

Expand full comment

"One in five Londoners had syphilis by age 35 in the late 18th century, historians estimate."

Expand full comment

I think there's something foundationally missing from this model. Very specifically - what about cranks and weirdos who were retroactively not cranks and weirdos?

More specifically - all of the computer science greats (Dijkstra, Turing, etc) did their foundational *mathematical* work well before they were household names (at least among people who are relatively intelligent and science-aware).

There's a great revolution that happened around 1980 that suddenly made computer programming, computer software, and thus *computer science* and all of its offshoots massively more high-status and important, because the Fast Transistor Microprocessor was starting to allow more and more things to use Software.

Without the Fast Transistor Microprocessor, none of that work would be lauded as the genius that it is (Turing, rather famously, went to prison) and would instead be an esoteric curiosity for mathematicians.

I get the feeling that with the amount of Science Infrastructure we have in place today, absent some New Technology or Paradigm that enables a lot of work that was done previously to matter in a new way, or enables new work - most people seeking truth are going to be happily chipping away in the truth mines for proofs or evidence for esoteric effects that aren't super relevant to today. We will lament their lack of progress in world changing and truth seeking for decades.

Suddenly - something will change, some new technology will get invented, or some new mathematical, computational, or scientific tool will become widely known or available, and suddenly proofs from 1960s mathematicians or earlier are no longer esoteric curiosities - they're the world-shaking foundation of the modern universe.

Expand full comment

I keep thinking about the time period of the Buddha, Jesus, and Mohammed. (I know that's quite a range of time, but in the course of human history, it's not so much.) Was there just a sweet spot around then for religions? Like, there was enough 'ambient philosophy' around that new and compelling religious discoveries could be made? (Although it's not what I actually believe, for this purpose assume that by "discovering" I mean: there are certain religious ideas that can be dreamt up which enough people will find compelling that they can gain a real foothold. Discovering is finding one of those ideas.)

Expand full comment

Arnold Toynbee thought a lot about this in his “study of history.” He postulated that as civilizations atrophy and then decay, the period of struggle during the long collapse often gives rise to a spiritual upheaval leading to a more enlightened religion. Speaking in broad strokes, Toynbee sees 3 major cycles of civilization (we are in the third), with each time of struggle spawning a more advanced spiritual state (e.g. Baal worship - Judaism - Christianity, with parallels in Asia and India leading to Hinduism and higher Buddhism). This new religion then goes on to define the successor civilization in Toynbee’s model. The time period you are talking about would be for him the struggle period of the second cycle of civilization.

Of course Toynbee is all but cancelled and forgotten these days for his taking religion and spirituality seriously in his historical analysis, and for his sweeping narrative approach which went afoul of postmodern historiography.

If he is right, and if we are in the struggle phase of Western Civilization (debatable), the question is what new spiritual system will arise from the ashes of the West. I feel like that is at least related to your question.

Expand full comment

I have a general conviction that hard polytheism is outcompeted by the alternatives because it doesn't hold up to scrutiny or really offer meaning or answer any important existential questions.

But in your comparison, I'd probably nix Mohammed and think instead about Zoroaster. Historians don't agree when he lived -- close to the reign of Cyrus the Great or 1,000 years before? But if we speculate it's the former, then a certain "Age of the Prophet" can start to be seen, centered on Persia and the lifetime of Cyrus, and probably ending with Mani. Cyrus ended the Jews' Babylonian Exile and began the Persian conquest of modern-day Pakistan, on Buddha's doorstep and near-contemporaneous with Buddha's life. He also came towards the end of the age of the Old Testament prophet, though a handful were post-Exile.

Now, as a Christian I'd argue that the prophet of that age was a sort of God-given Jewish "technology" that spread to Persia and finally India, but non-Christians will generally argue the reverse and that Second Temple Judaism imitated Zoroastrianism. I think this is mostly a faith judgement either way.

Expand full comment

Weren't early scientists amateurs because science wasn't a profession you could earn a living in?

Expand full comment

Steven Johnson has a similar concept he explains in his book, Where Good Ideas Come From: The Natural History of Innovation, called "the adjacent possible." His analogy is that every new discovery opens a door into a new room which contains yet more doors. Each new discovery opens paths to new discoveries.

Expand full comment

I have been a bit confused by the premise of this conversation on genius and the perceived implications (concern?) that it seems to be bringing up.

My (oversimplified?) understanding of Hoel's original piece:

1. The world used to produce "geniuses" (towering giants in a single field or multi-disciplinarians who made large contributions across many fields). Some of them even made their contributions in their spare time!

2. We don't do this any more

3. This is bad/concerning

4. How can we solve this problem?

5. Aristocratic tutoring?

Isn't this essentially specialization playing out? The reason this doesn't happen anymore is that, even for people with the same natural talent as past geniuses, any comparative advantage is more than overcome by the specialization required to make a contribution in nearly all fields. Instead of being a problem, isn't this a natural consequence of all of the efforts of those who came before? As Scott's analogy is pointing out, hasn't all of the low-hanging fruit been picked?

That strikes me as a much simpler answer than a lack of aristocratic tutoring.

Expand full comment

Interesting article. I think one element this fails to take into account is the general category of surprise/accidental discoveries. Like Kuhn's paradigm of scientific revolutions on a small scale.

To put that in terms of your example: What if one day little Jimmy the forager trips and lands face first on a rock and realizes it's edible? It doesn't matter then if he is experienced, smart, old or young.

Scientific progress is not necessarily linear?

Expand full comment

I think the conceit that knowledge is dimensional is flawed in a number of ways, not least the ways others have already brought up, such as that historical ideas make entirely new ideas possible.

I'll observe that somebody (Cantor) invented set theory. He didn't find a new space in the territory - he created new territory out of nothing.

Expand full comment

I agree, new territories get created from time to time.

Expand full comment

Sounds solid to me. But I'll nitpick against the claim that `physics is stagnant.' This is arguably true for high energy physics, but physics as a whole remains vibrant, largely by virtue of constantly inventing new subfields (which open up new foraging opportunities). See my DSL effortpost on the topic here https://www.datasecretslox.com/index.php/topic,3007.msg91383.html#msg91383

Expand full comment

Per the typology I propose in that effort post, you can divide up physics into six major subfields, of which only one can really be argued to be stagnant. That subfield only accounts for ~10% of professional physicists (according to statistics from the American Physical Society), although it might be more like 99% of `physicists that talk to journalists.'

Expand full comment

LOL! I agree, physics as a whole is not stagnant.

Expand full comment

I have enjoyed Scott's whole collection of posts around research productivity. I want to throw in another ingredient that I think should get more attention.

In most fields, having a research-focused career has gotten dramatically more competitive over the last generation or two. Intense competition can help motivate people to work harder and reach further, but it can also stifle creativity. I'm specifically thinking here about the need to publish and get grants, and how in highly competitive areas it's easy to shoot down an application or manuscript due to some weakness, even if there's something really interesting in it. It's super-extra-hard to come up with brilliant new vistas to explore when you simultaneously have to defend against a horde of maybe-irrelevant criticisms.

If this dynamic is important (not sure if it is), the only way I see to address it is to somehow collectively limit the number of people who have research careers.

Expand full comment

Maybe amateur scientists are less common because our true leisure class is smaller? Even the children of oligarchs like Trump's kids pretend to flit around doing some kind of Succession thing, whereas in the past it was totally normal to own enough land to support your lifestyle and then go off on a hobby forever.

Expand full comment

You are never supposed to forage in the immediate vicinity of your camp. The area around your camp should be left alone for emergencies.

Expand full comment

Machine learning might be an inherently complicated subject.

Evolution / Darwin took many years because there was so much work involved.

Physics might have essentially been less complicated in the 1920s.

Generally, there is no clear formula for how complicated a field is, which is not closely related to how old it is.

Expand full comment

And, of course, Darwin had several centuries of naturalistic observations and descriptions of life forms and life ways to build on. Without those, he'd have had little on which to base his grand theory. The theory is built on centuries of descriptive work.

Expand full comment

I tend to respect my doctoral advisor, and that means I tend to respect the economists he respects, including his doctoral advisor and (presumably) the economists his doctoral advisor respected, etc.

what if "geniuses" are just the genghis khans of science?

Expand full comment

In some ways the foraging analogy is apt but one thing it fails to capture is the inherent high dimensionality of the search space of scientific discovery.

Foraging primes mostly two or three dimensional intuitions but high dimensional spaces are a different beast so relying on those intuitions can be misleading.

Expand full comment

A couple years ago I presented a somewhat more abstract version of the low-hanging fruit argument that takes off from a 1992 article by Paul Romer (Two Strategies for Economic Development, 1992): Stagnation, Redux: It’s the way of the world [good ideas are not evenly distributed, no more so than diamonds], https://new-savanna.blogspot.com/2020/08/stagnation-redux-its-way-of-world-good.html.

That blog post makes up the second part of my working paper, What economic growth and statistical semantics tell us about the structure of the world, August 24, 2020, 19 pp, https://www.academia.edu/43938531/What_economic_growth_and_statistical_semantics_tell_us_about_the_structure_of_the_world.

The argument is about the relationship between the world itself and our cognitive capacities. Because the world itself is “lumpy”, rather than “smooth” (as developed in the working paper, but akin to “simple” vs. “complex”), it is learnable and hence livable. The American economy has entered a period of stagnation because the world is lumpy. In such a world good “ideas” become more and more difficult to find. Stagnation then reflects the increasing costs of the learning required to develop economically useful ideas.

Expand full comment

The 'rational equilibrium' for gatherers is that the depletion of any particular land should be exactly what makes gathering there as good as gathering anywhere else. That is, (total time - travel time) * full gather rate * (1 - depletion) = constant. So if land 6 hours out is 50% depleted, land 9 hours out should be 0% depleted (so it's worth the same 300 points); land 8 hours out should be 25% depleted (also 300 points).
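A quick check of those numbers, as a minimal sketch. The 12-hour day and 100 points per hour on untouched land are assumptions implied by the 300-point figures, not stated above:

```python
# Toy check of the foraging equilibrium.
# Assumed numbers: 12-hour day, 100 points/hour on undepleted land.
# Patch value = (day - travel) * rate * (1 - depletion); at equilibrium
# every patch worth visiting yields the same value.

DAY = 12      # hours available per day (assumption)
RATE = 100    # points per hour on untouched land (assumption)

def patch_value(distance_hours, depletion):
    """Points gathered in one day from a patch `distance_hours` away."""
    return (DAY - distance_hours) * RATE * (1 - depletion)

for distance, depletion in [(6, 0.50), (8, 0.25), (9, 0.00)]:
    print(distance, patch_value(distance, depletion))  # each prints 300.0
```

Rearranging, the equilibrium depletion at distance d is 1 - C / (DAY - d), with C = 3 under these assumptions, which falls to zero at 9 hours out, matching the numbers above.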

Expand full comment

Seems to me discoveries are not discrete objects that are gathered, rather that they're composites synthesized from existing states. This means each discovery adds a new state to the space from which increasingly complex blends of discoveries can be synthesized. More of a network effect where every truth increases possible combinations. E.g., discover a transistor, and then other semiconductors become more likely to be discovered, and many other uses and devices are discovered through them, including an integrated circuit, etc. We're building a mountain, not digging a hole, so there is a bigger and bigger pile to draw from.

Expand full comment

"Seems to me discoveries are not discrete objects that are gathered, rather that they're composites synthesized from existing states."

I don't think the basic argument depends on that. The analogy Scott uses, foraging, is couched in terms of discrete objects gathered by individuals. But the underlying argument could be couched in terms of epistemic agents searching for explanations or models. An epistemic agent could be an individual, but it could also be a group of individuals, perhaps spread out in time and space, working on a particular problem. Each individual would offer this or that contribution toward solving the problem.

Expand full comment

"many of the most innovative crypto people (eg Vitalik Buterin) seem young, but that could just be a “crypto is cool among young people” thing."

It seems to me these may not be unrelated. It's entirely possible that crypto is cool among younger people because that's a field where younger people are not disadvantaged.

Expand full comment

The model is wildly inappropriate. Science doesn't work like that at all. Scientific advancement almost never consists of finding a brand-new explanation for what has never had an explanation before. Generally, it consists of finding a *better* explanation in the face of (rarely) pure brilliant insight, and (almost always) the receipt of new and better experimental data.

For example, people had theories of combustion going all the way back to the Greeks, and probably back to Neolithic Man. The correct modern explanation was constructed by Lavoisier, replacing the prior "phlogiston" explanation, when new experiments on the oxidation of metals became available.

Quantum mechanics replaced classical mechanics because of new data on spectra, not because nobody had any notions of how atoms worked -- they had ideas, some of which were of course wrong (or incomplete), but they fit the data then available, e.g. German organic chemists of the 1880s and 1890s had accurate and deeply insightful ideas of atomic valence, empirically derived, without the slightest idea of the structure of the atom from which they arose. Same with Newtonian gravity explaining the orbits of the planets, which replaced notions of perfect complicated nests of spheres *because* of the significant improvement in observational data of the planets brought about by the invention of the telescope. The notion of cells came about because of the invention of the microscope. Thermodynamics was formulated because careful measurements on heat input and output (undertaken because of the surging importance of the steam engine) produced new data. It's not that nobody had *any* theory of heat before Count Rumford observed cannons being bored in 1798; people had a working theory of heat that was simply wrong, or more precisely insufficient in the face of new data. And on and on.

The first and most important engine of scientific advance is the invention of new instrumentation, which itself comes mostly from technological demands. When people are highly interested in new machines, to do new things (or do old things differently), then you get new measurements, new instruments, new data -- and, by and by, new theory. (Which traditional explanation is more than sufficient to explain the renaissance of "AI" ideas lately -- look at the massive technological demand for, e.g. better directions in Google Maps.)

The much better model of scientific progress is that it's a case of various areas of knowledge coming into sharper focus. We *think* we understand some area -- the movement of heat, kinematics, the way substances chemically combine -- and it works for a while, but then some new data cause us to need to sharpen our focus, understand things in more detail, and sometimes even replace the major struts and beams that underlie our understanding in general.

But this has nothing to do with any expanding circle of inquiry, and the metaphor that you have to travel greater and greater distance to find something new is wholly inapplicable. People have made stunning discoveries by simply looking where no one else thought to look (cosmic microwave background), or considering a possibility no one else thought to consider (RNA catalysis). They didn't have to master some much larger amount of prior material than people who made similarly remarkable discoveries in an earlier age. Indeed, if anything, I'd say incremental discoveries at some far frontier on top of a huge complex pyramid of prior understanding are rarely deeply significant. Those are more the workaday nailing down of the details stuff that occupies the host of scientists in the 2nd and 3rd rank. It's discoveries that re-examine what "everyone knows" -- that velocities should add simply, that the flow of time should be the same for everyone, that if you mix two substances the resulting compound should have properties in between those of the components -- which prove enduring and powerful.

There have been plenty of times when people have observed a reduction in the pace of scientific advance, and concluded that new discoveries were just becoming intrinsically harder, cf. Albert Michelson's famous statement in 1894:

"The more important fundamental laws and facts of physical science have all been discovered, and these are now so firmly established that the possibility of their ever being supplanted in consequence of new discoveries is exceedingly remote... Our future discoveries must be looked for in the sixth place of decimals."

...which, of course, was also famously amazingly and hilariously wrong, inasmuch as it was made by a professional and highly accomplished physicist a mere handful of years before astonishing discoveries were to shake the foundations of physics.

The progress of discovery is never smoothly monotonic. We get waves of progress, then periods of stagnation -- sometimes centuries long -- and even in some cases lose ground. These things need no better explanation than human nature. The idea that if *our* time happens to be a period of slowdown or stagnation this must be because of some intrinsic essential difficulty in advancement, and not because our social structure, say, has hobbled human creativity to a greater extent than prior ages, seems to me fairly narcissistic, in the same sense as medieval monks placing us at the very center of the universe, and as the very origin of all its meaning -- e.g. the orbits of the planets or habits of wolves were as they were primarily *because* of how that affected human affairs. We're just not that special, not this species, and not this generation.

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

I will say also I think a significant culture-wide increase in intellectual narcissism could *be* part of the reason our progress has slowed. When I compare the modern practice, and practitioner, to those who made significant steps in the 60s and maybe 50s (that being as far as my personal interaction goes back), one thing that *does* stand out is how much more relaxed and humble the older generation was. The modern practice of science is more uptight, more self-aware, full of a greater degree of social hierarchy, and worried to a greater extent about social status and status security. (And not just from within the field, but also from the wide world outside the field.) People are much more cautious about saying bold, crazy, and probably wrong things, particularly when young, although of course anything genuinely new *always* starts off seeming bold, crazy, and probably wrong.

You look at a guy like Elon Musk, who has certainly arguably kicked more than one area of technology in the ass, provoking a flood of invention, new data, new instruments (and maybe at some point new theory), and the thing that stands out significantly about his character is...he's not afraid to be wrong. He says crazy shit that has smart people shaking their heads, and he often *is* wrong. But he still says it, and acts on it, and the fact that he's *unusual* for our modern technological leadership corps says, I think, a lot about the times we live in.

Expand full comment

I think these are the best comments I have read in this thread thus far.

Expand full comment

I find this very convincing.

Expand full comment
Apr 3, 2022·edited Apr 3, 2022

Yes! (I went looking for your response, 'cause I knew you'd say much of what I wanted to say, but better.) I wanted to add a few things.

1.) Low-hanging fruit depends on technology (yesterday's ideas turn into stuff). You need glass optics to make microscopes and telescopes. Transistors and lasers make whole new areas open... new fruit.

2.) There is something about getting a group of smart people together that is better than all of them working alone. Having lunch (or breakfast) with Harry Nyquist was a good thing. https://en.wikipedia.org/wiki/Harry_Nyquist.

3.) Current stagnation. I think human narcissism has probably been pretty much constant. I think the problem is more with the funding, money, having a career. Every field in science has its dominant model, and that paradigm sucks up all the research money, and to have some crazy idea that goes against that model is to not get funded... so no crazy ideas.

Expand full comment

I'm not convinced. On the previous article I couldn't find the right words for this to make a comment, but I think I do now:

Geniuses are interesting when they are uncommon. If they are rare, you are unlikely to hear about them. If they are common, then they aren't noteworthy. Since the time of Einstein, Bohr, Feynman, etc., we've pushed hard to get students into STEM, and so now STEM geniuses aren't a novelty.

---

Newton famously said "If I have seen further it is by standing on the shoulders of Giants." The underlying idea behind that is that his noteworthy insights were easier to find because of discoveries that came before him. Matt Ridley's book How Innovation Works talks about how most of the famous inventions with famous inventors aren't the product of some lone genius who saw something everyone else couldn't. Instead there were multiple teams working towards the same goal, but our society and legal system give all the credit to one team or person. Sometimes the teams were a single person, sometimes a group, sometimes it was an entire crew (eg Edison).

I think this concept of some innovations making others easier to discover is missing from your village metaphor. Maybe instead of picking fruits from trees, the villagers are picking berries from a dense pricklebush. As you pick the berries, you cut away the vines, making it easier to access the further berries. On day 1, picking a berry 100m away might take one hour of pushing through dense and prickly brush. On day 105, it may be a leisurely 1-minute walk.
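A toy sketch of the difference (made-up numbers, not from the post): suppose each berry sits one segment further out, a fresh segment takes an hour of bushwhacking, and a segment someone has already cleared takes a minute to walk.

```python
# Toy comparison of one-way travel cost to the n-th berry.
# Assumed costs: 60 minutes per uncleared segment, 1 minute per cleared one.

UNCLEARED = 60  # minutes per fresh segment (assumption)
CLEARED = 1     # minutes per already-cleared segment (assumption)

def bare_trip(n):
    """Minutes to reach berry n if no path ever gets easier (plain foraging)."""
    return n * UNCLEARED

def pricklebush_trip(n):
    """Minutes to reach berry n after berries 1..n-1 have cleared the way."""
    return (n - 1) * CLEARED + UNCLEARED

for n in (1, 10, 100):
    print(n, bare_trip(n), pricklebush_trip(n))
# bare: 60, 600, 6000 minutes; pricklebush: 60, 69, 159 minutes
```

In the bare model the marginal berry keeps getting more expensive with distance; in the pricklebush model it stays nearly constant, which is the point: earlier picking keeps later berries cheap.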

Expand full comment
Apr 1, 2022·edited Apr 1, 2022

Over the past two years of the pandemic I noticed several odd behaviors of SARS2 spread and inter-variant competition that seemed to defy standard epidemiological models. With each new wave of VoCs we'd see some strikingly different epidemiological behavior in different countries. I'm just a scientifically well-educated hobbyist when it comes to pathogens, but it seems like there were all sorts of green-field research activities and cross-specialty investigative opportunities that researchers weren't interested in following up on. Many times these phenomena went unnoticed, or, if they were noticed, they were explained away by NPIs and such. There may have been lots of low-hanging fruit out there, but either because current research is so siloed, or because the grant-making agencies are giving out money to established researchers who have been investigating one thing for their entire career, this low-hanging fruit seems to have been ignored (at least for SARS2).

Expand full comment

The model is too simplistic.

The world population in the 1800s, say, was 1 billion people. There are 7.5? 8? billion now. From a pure warm-body standpoint, that's 7.5 to 8 times more competition.

But this is the 100,000 foot level.

Science is very much a luxury activity. It rarely, if ever, provides tangible first-person benefits to the scientist in their own lifetime. The only exception is our present period (more later).

So even if we look at the 1 billion, how many are actually able to take on the luxury of being a scientist? 10,000? less?

Now luxury/lifestyle vs. profession. The ability to make a career out of being a scientist is probably no more than around 2 generations old. Prior to World War 2, I am fairly sure all the scientists were inventors/kooks/curious/etc - but there just weren't jobs to be a scientist per se. There were jobs to be an academic - but not all academics do science and academic mainstream thinking and science have, at best, a fraught relationship.

So is science today better with professional scientists?

Clearly it is better in the sense of developing known paradigms.

Not so clear if it is better at generating new paradigms - deviating from expectations now carries not just a social but an economic penalty for a professional scientist.

We'll see.

Expand full comment

The fact that the foragers are living on a 2d surface is doing a lot of work here, and if we get rid of that then the argument kind of falls apart.

Scott has previously written a lot of things comparing human thought to the way artificial neural networks function, so hopefully it won't be too controversial when I say that an idea is very very roughly like a point in an extremely high dimensional space. (Neural network activations are very high dimensional vectors, and we're saying that they kind of correspond to "ideas" or "things the network is thinking".) We should probably think about scientific ideas / discoveries in the same way: they are ideas that happen to be good in a very large and high-dimensional space of possible ideas.

If the foragers are foraging in a very high dimensional space rather than in 2 dimensions, then they'll pretty much never run out of fruit to pick that's a very short distance away. The front of discovery will expand quickly at first, but will soon slow almost to a halt. In this later stage, if a forager increases the distance he is willing to walk by just a couple of steps, then suddenly whole vistas of millions of fruit trees become available, enough to spend a lifetime picking, without ever coming close to running out.
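As a rough illustration of that last point (toy numbers only): the volume of a ball of radius r in d dimensions scales as r^d, so extending your reach from r to r + 1 multiplies the reachable volume by ((r + 1)/r)^d, which blows up as d grows.

```python
# Toy illustration: how much new territory one extra step of reach opens up
# in d dimensions, starting from a reach of r = 10 steps (assumed numbers).

r = 10  # steps already within reach (assumption)
for d in (2, 10, 100):
    gain = ((r + 1) / r) ** d  # factor by which reachable volume grows
    print(f"d={d:>3}: one extra step multiplies reachable volume by {gain:,.1f}")
# d=  2: about 1.2x; d= 10: about 2.6x; d=100: about 13,780x
```

In two dimensions a little extra reach buys a sliver; in a hundred dimensions it buys orders of magnitude more trees than everything already picked.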

Expand full comment

"If the foragers are foraging in a very high dimensional space rather than in 2 dimensions, then they'll pretty much never run out of fruit to pick that's a very short distance away."

Doesn't that depend on how the "fruit" is distributed in the space? They may not be evenly distributed. They might be distributed in clumps with relatively large empty or sparse regions between clumps.

Expand full comment

That's true but a very high dimensional space will more than compensate for that. Of course high dimensional spaces are very hard to navigate so, for example, you may still be better off walking ten steps along a well worn path to forage even though there is almost certainly a vast undiscovered treasure trove only three steps away.

Expand full comment

Of course, the foragers will become stuck if they don't realize they can move in other than the two dimensions.

Expand full comment

I'm interested in cataloguing ten-thousand-foot answers like this, if anyone wants to try to succinctly state ones I've missed (https://muireall.space/progress/).

Expand full comment

So moving camp is like a paradigm shift ...

Expand full comment
Apr 2, 2022·edited Apr 2, 2022

In this foraging model, one-on-one tutoring is equipping some foragers with a bicycle with a handlebar-mounted spotlight. Grad school is a network of trams, with the rails going near previous large finds.

As the tram network grows, resistance to growing it further increases, as does the cost.

Expand full comment

Could it be in part motivational? If we tell the young people with the greatest minds that science is no longer about inducing fundamental, sweeping truths about the world, but merely about falsifying hypotheses, then maybe “science” seems to them like a waste of their potential.

Expand full comment

Also, we tell the ambitious kids with the scientific potential (at least in math and computer science) that they should work for some company and make money.

Actually, Kepler spent a lot of time casting horoscopes, so it's not like people in the past did not need a day job. But Kepler, while doing the horoscopes, probably had more opportunity to also do actual science than someone working at FAANG today. (Especially with all that "all your inventions belong to us, even if you make them in your free time" shit.)

Expand full comment

Hasn't fame and fortune been one of the big drivers of scientific discovery? There is still plenty of terrain to explore, but a lot of the really important and pressing problems have been picked over. Building the first aircraft was a dream of many inventors. Creating a tweak for a slightly more fuel-efficient flight doesn't hold the imagination in the same way.

For amateurs, Coley's toxins is a good example. Coley became known as the father of cancer immunotherapy by injecting people with bacteria. In some way he was able to move the field forward, even though he had no understanding of the mechanisms. This seems less likely to happen today.

Expand full comment

It's not just about practical applications. Once the periodic table is discovered, it can't be discovered again.

Expand full comment

Scott, in this debate I am 99% on your side. Yes, it is relatively easier to be a genius when all the low-hanging fruit is still hanging there. Today, if you discover something amazing in math, chances are that someone else already made the same discovery decades ago; you just didn't notice because there are too many things like that. For every obscure part of science that has been around for a long time, some people have spent their entire lives commenting on everything they noticed, and it is hard even to find something they didn't notice. Which, if I understand it correctly, is what you are saying.

The remaining 1% is the argument... sorry, too lazy to find the link now... that people today lack the *ambition*. Yes, many people homeschool and hire tutors, but very few of them do it with the explicit purpose of making their child a *genius*. Most of them simply aspire to get the child admitted to a prestigious university.

Consider the difference between Polgár -- who decided that his girls are going to become chess grandmasters, and organized their childhoods accordingly -- and some modern parents who merely want their kids to attend an afternoon chess club because they believe that this kind of extracurricular activity will make their child more likely to get into Oxford.

I would predict that given the same levels of innate talent, the latter children will most likely *not* become chess grandmasters. Not because they couldn't, but simply because they will not spend enough time and effort. (They will probably also be actively distracted by doing *other* things that might also increase their chances at Oxford, like riding a pony or whatever.) This is the difference between trying, and trying to try, so to speak. Instead of striving to become stronger, you are just trying to project an image of a "hard-working student"; you are not really trying to become the best in the universe, because even the hard-working students are not socially expected to actually do that.

Unlike the aristocrats in the past who hired tutors for their kids, people today who are rich enough to afford any amount of tutoring are not trying to make their kids as great as possible. They are okay with making them good enough. (And sometimes it is easier to just donate some money to the university.)

I did a lot of math tutoring when I was younger. People hired me to teach their kids who were failing at school, or who were trying to get admitted to some school. No one has ever offered me money to teach them or their kids anything *beyond* what the school required. I have a gold medal from the International Mathematical Olympiad, and I am also interested in psychology and pedagogy; that could in theory make me qualified for this kind of tutoring. But there is no demand on the market, as far as I know.

Expand full comment

> No one has ever offered me money to teach them or their kids anything *beyond* what the school required.

Okay, to be more precise, this was true about *math*. In computer science, sometimes people are interested in things beyond what they immediately need for their school or job.

Expand full comment

Today there are more people on earth, and far more of them have the means to pursue very specific interests, so I am skeptical of your claim that today fewer kids are being trained in very specific ways.

Expand full comment

I'd need more data. How many past geniuses started out aiming to make major advances? How many had highly ambitious parents?

I know there's thought about finding important problems to work on, but even that doesn't reliably produce geniuses. (Anyone have the link? It might have been someone at Bell Labs.)

Expand full comment

I suspect the Bell Labs thing you're thinking of is https://www.cs.virginia.edu/~robins/YouAndYourResearch.html

Expand full comment

Wolves DO NOT eat people. Participating in the idiotic, anti-factual demonization of wolves is pathetic and disgusting, even as a metaphor. Shame on you.

Expand full comment

I'm genuinely curious, why do you think it's so important to defend wolves in such a contemptuous manner? I could see a comment like yours written more as a "fun fact, wolves don't actually eat people contrary to popular belief" but your invective is making me wonder if you've got some sort of personal stake in how wolves are perceived.

Expand full comment
Apr 2, 2022·edited Apr 2, 2022

Why do I think it is important to defend a species (that genetically includes dogs, our own species' "best friends") that has faced centuries of hateful lies, ignorant demonization, and hideously evil eradication efforts by selfish morons who think this world is theirs to do with whatever they please? The better question is why you think those who want to exterminate a beautiful and ecologically critical species, and those who would casually use metaphors based on nothing but ignorance and lies, deserve anything other than contempt. I do have a personal stake in it, because I am an ecologist and evolutionary biologist, so I understand how ecosystems work and that, despite the hubris of mankind, we still rely on ecosystems for our own survival. Just look at all the wolf-eradication laws being enacted in western states where I live, and the deluge of rabid anti-wolf vitriol, and then tell me my admonition is unwarranted.

Expand full comment

Contempt smears the person spewing it, not the person it's directed against. Maybe wolves are as important as you say, but the manner in which you express your values leads me to doubt whether or not to accept such a valorization from you.

Expand full comment
(Banned)Apr 5, 2022·edited Apr 5, 2022
User was indefinitely suspended for this comment.
Expand full comment

If your goal is to persuade people that wolves really matter, and that Scott's post harms them, then you've failed with me precisely because of your contemptuous manner. Maybe that isn't your goal, though. In which case, sure, my lack of acceptance is immaterial.

Expand full comment
Apr 6, 2022·edited Apr 6, 2022

I didn't ask for you, whoever you are, to reply to my comment. Your very existence is immaterial, not merely your acceptance. "Scott" used a wildly inaccurate and contemptible metaphor, and I voiced my informed indignation. Then YOU decided you just couldn't keep your mouth shut and proceeded to peddle your unsought advice, which was itself contemptibly patronizing. You have well earned my contempt, you prick.

Expand full comment

Wow, you are so insensitive to the suffering of Werewolf-Americans. Why is this kind of bigotry tolerated at ACX?

Expand full comment

Put some effort in at least; that joke is just lazy. You can do better, I believe in you, kid.

Expand full comment

The most extreme example of low-hanging fruit should be reading the books of lost civilizations. The greatest Roman scientists—Hero, Galen, Ptolemy—are the ones who read the Library of Alexandria, and yet they are pale reflections of their sources. Archimedes was the main source of inspiration for science for almost 2000 years. And yet, even with the hint given by reading Archimedes, the golden ages of Roger and Francis Bacon were substantially slower in reconstructing Hellenistic science than the 200 years it took the first time. It's not entirely clear what Hellenistic science accomplished, but probably the 17th century accomplished nothing outside of it, and much less than half of it. And that's a lot better than any prior attempt to extend Archimedes!

Expand full comment

While the general model is hard to refute and makes perfect sense, the scaling seems to boggle the mind. Keeping in mind that technology expands so rapidly, it seems to stretch the analogy that the foraging would truly be THIS much more difficult each generation. Are we really going from dozens of barely trained amateurs being geniuses each generation to almost none?

I think the answer here has to be more complicated, if only to explain how much noise gets into this signal in the first place.

My personal hobby horses may bias the following example:

In the times of Ben Franklin and Lavoisier, it was common to have lots and lots of leisure time if you had any at all. Aristocrats were famously bored and most developed serious hobbies. Those who became scientists had more time for different kinds of creative thinking than we can justify in academic careers today. Freed from pressures to publish, teach, or even make sense to anyone else, they could think in a manner unavailable to most today. Intellectual minor leagues weren't limited to politics. You got into the major league by coming up through the minor league at least as often as through classes. Many others make the point about the pressure to publish being bad for scientific rigour in many cases. But think also about how it shapes the thought process of a scientist searching for truth. Think about the change over time in how scientists' labour is divided between administration, experimentation, collaboration, and quiet thinking. I believe that if we calculated the hours lost in a scientific career today to nonproductive administrative time, the curve would flatten a little. If we also include time lost to trying to publish papers that turn out to have irreproducible results, the effect may be more dramatic.

One parallel worth noting in the wider society is uses of leisure time. Leisure time for the general population is at an all-time high, but so are entertainment options. Media consumption is at an all-time high too. This leads to a lot of lost time each day that could be spent on hobbies, investigation, and thought. It certainly affects the contemporary aristocracy, almost none of whom are amateur scientists. It certainly affects me. I have a paper to finish by the end of this month and I am behind. I think it is not a stretch to say a vastly more distracted world will produce less signal and more noise.

Expand full comment

My Exhibit A for the youth thing in ML is Ilya Sutskever (36). He is a co-founder of OpenAI and a co-author on AlexNet (the groundbreaking convolutional neural network), sequence-to-sequence learning, the GPT papers, and AlphaGo. The Bay Area Rationalist scene is also chock full of world-class ML nerds who are in their late twenties.

Also, "young" might have to be re-calibrated to the typical age for a prodigy to get a Ph.D., and "old" should also be re-calibrated since everybody's living longer.

Einstein once said, "A person who has not made his great contribution to science before the age of 30 will never do so." Probably it's 35 now, and maybe a hundred years before Einstein it was 25, assuming what he said is true.

Expand full comment

I would like to suggest an alternative explanation, which I'm borrowing from David Deutsch: the possibility that genius has been less common since the late 1800s/early 1900s because of bad philosophy:

“ Let me define ‘bad philosophy’ as philosophy that is not merely false, but actively prevents the growth of other knowledge.”

Most scientific fields, and even popular culture, were corrupted at that time by the philosophical anti-enlightenment, where ideas became arbitrarily relative.

In such an environment we'd expect to see both very little actual progress (because science is now more focused on instrumentalism than explanations, leading to an infinite regress of vague theories) and a lot more tall poppy syndrome (because if there is no standard of beauty, anything is art, and no one should be labelled a better artist than anyone else).

I don’t know if this idea has already been discussed, so sorry if I'm being redundant. Here’s David’s explanation of bad philosophy’s impact on physics: https://publicism.info/science/infinity/13.html

Expand full comment

Seems like positivism should be taken "seriously, but not literally". The good part is that science should indeed be grounded in observations. But if you take it to the extreme, then it is "unscientific" to talk about anything that is not your immediate perception. (Then we could nitpick about what kind of perception counts as sufficiently immediate: is it okay to use a magnifying glass, a telescope, a microscope, an x-ray...)

In real life, of course, there will be a different burden of proof on popular and unpopular theories.

Expand full comment

That is actually a good description of instrumentalism, not the Popperian falsificationism that Deutsch (and I) endorse.

All observation is theory-laden. First we have explanations; then we can observe something. Which is why we can only falsify, or engage in confirmation bias. This makes what is perceived a part of the theory, not some objective standard.

What I think had happened in the last 100 years is that people have ended up endlessly debating the possible truth of P values instead of creating explanations that are falsifiable, and making real progress.

Expand full comment

Interesting model. But it doesn't seem to include the sometimes revolutionary questions that earlier answers can pose. In other words, scientists eat the low-hanging fruit and poop out seeds that grow new fruit trees in the proximal area.

Expand full comment

I've posted an alternative model in response here: https://www.lesswrong.com/posts/WKGKANcpGMi7E2dSp/science-is-mining-not-foraging

Cross-posted from LW:

Tl;dr: My model for science is that it is like mining outwards from a point. This offers predictions that extend beyond Scott Alexander’s foraging metaphor and sometimes disagree. The mining metaphor emphasises the fact that research exposes new research problems; that research is much slower than learning; and research can facilitate future learning. The model offers actionable advice to researchers (e.g., “avoid the piles of skeletons” and “use new tunnels to unexplored rock-faces”). In part II, I also object to the idea that it is concerning that a smaller proportion of people are now “geniuses”, arguing that this would be true even if the importance and difficulty of intellectual feats achieved was constant because of the way in which genius is socially constructed.

Expand full comment

I know the virtue of the model was its simplicity, so I feel bad about proposing an addition, but here goes: We're not modeling *satiety*, which I think is real, and explains a lot of the slowdown we observe. Basically, the most scientifically productive regions place comparatively less priority on progress because there's peace, or at least, no life-or-death sci/tech race, and because our basic needs for the fruits of technology are mostly satisfied most of the time, which leads us to prioritize goals other than progress. Even actual scientists and the institutions to which they belong clearly now put more emphasis on making sure that scientists have comfortable jobs, predictable salaries, prestige, and fair treatment. This selects for people who can convince grant committees and peer reviewers, not necessarily those who burn with larger ambitions. The latter generally try to make it big in industry, but the sort of industry that caters to largely satiated people tends to spend its collective brainpower on making people click on ads for dog accessories. So I think there is a general lack of urgency for real progress, and a corresponding decline in concerted effort.

Expand full comment

One other field to consider in this analysis is religion. More precisely, religious research. Israel now has 150,000 Yeshiva students learning full time, often as a career. But the pace of religious discoveries (important Halachic innovations) has not sped up.

Expand full comment

> I haven’t done the work you would need to distinguish between these two explanations yet, although I find it suggestive that the trend is more pronounced in theoretical physics than in biology.

This link doesn't at all mention biology. And I'm not sure what you mean by "the trend is more pronounced" — but my impression was that more theoretical and mathier sciences (like theoretical physics) typically have younger people making contributions, mostly because fluid intelligence peaks during people's 20s.

Expand full comment