That cuts against the observable evidence, however. If women and ethnic minorities entering the sciences allows for increased discoveries, we should expect that the rate of discoveries has increased over the past 100 or so years as more women and ethnic minorities have gotten into the sciences. That does not seem to be the case; if anything, the correlation goes the opposite way.
"[W]e should expect that the rate of discoveries has increased over the past 100 or so years as more women and ethnic minorities have gotten into the sciences. That does not seem to be the case;"
What? You don't think the rate of "discoveries" has increased over the last 100 years?
I'd say that is a ridiculous statement, but perhaps I do not understand what you mean by "discoveries" and the "rate of discoveries".
I'd say the last 100 years represent the most significant increase in the rate of discoveries in all of human history.
You might have been missing the thread of this and the previous posts. We are not generating more geniuses in the sense that big breakthroughs are not happening as much per unit of scientists or time or whatever metric you want. It takes more scientists more time and more money to advance things now than 100, 200, 300 years ago. The discussion is about why scientific progress is slowing down, not why the rate of scientific progress is increasing.
If you have evidence that the rate of scientific progress is increasing I expect Scott and everyone here would love to see it. I would.
The original comment is deleted, but looking at the rate of discovery per scientist feels like the wrong way to assess whether opening up science to women and minorities led to more progress. The way we'd expect that to help would be by increasing the number of people who are scientists (and possibly increasing the average quality of scientists if we think that demand for scientists is inelastic and greater supply pushes up required quality).
This is hard to measure, but Erik Hurst & friends have a paper arguing that something like 20-40% of economic growth since 1960 can be attributed to opening up professional occupations to women and minorities.
It is totally possible that things would have been worse had women and minorities not been allowed to be scientists, and the rate of progress would be even lower than it is. One has to make that argument though. The original comment did not, but rather said that it has been a big boon. If it has been a benefit, that benefit has apparently been overwhelmed by negatives.
Economic growth is very different from scientific advancement as well. It is much easier to produce more stuff by putting more people toward it, especially because it is easy for people to move from lower- to higher-productivity occupations, since it is clear which is which: higher pay pretty consistently suggests higher productivity. In science it is really hard to do that. You don't see really smart biologists changing careers to become political scientists because that is where all the productivity is. (Arguably lots of failed mathematicians becoming economists has made the field worse, but that's another issue.) So even if it is true that a third of economic growth is attributable to bringing in women and minorities, which I would buy as reasonable, it isn't at all clear that it should likewise apply to science.
I totally agree with you that improving economic output by increasing the number of people available for high-productivity occupations is much easier than increasing scientific progress in the same way. And, of course, measuring productivity is much easier than measuring scientific progress. I'm also happy to believe that the deleted comment was totally wrong :).
One small thing I'd still push on though is that, in my understanding, scientific progress per se hasn't necessarily slowed down. My understanding is that the _number_ of scientific advancements per year has, in fact, increased rapidly. With the example of Moore's Law, maintaining a constant rate of increase in the number of transistors per chip means that the number of transistors added to each chip per year is dramatically increasing. Likewise for crop yields. Likewise for the number of patents and research publications per year.
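To make that concrete, here's a toy calculation (the starting count and growth rate are made up for illustration, not real Moore's Law data): with a constant percentage rate of increase, the absolute number of transistors added each year keeps climbing even though the rate itself never changes.

```python
# Toy illustration with invented numbers: a constant ~41% annual growth rate
# (doubling every two years) means the *absolute* increment per year grows,
# even though the *rate* of increase stays flat.
count = 2_000                # hypothetical transistor count in year 0
rate = 2 ** 0.5 - 1          # ~41% per year, i.e. doubling every two years

for year in range(1, 11):
    added = count * rate
    count += added
    print(f"year {year:2d}: total = {count:12,.0f}, added this year = {added:12,.0f}")
```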
The issue is twofold: first that the rate of increase has slowed down across many scientific domains, and second that the number of researchers employed to produce that rate of increase has increased dramatically (so that, as you say, discoveries per scientist have fallen).
My point is just that the way we'd expect expanding the talent pool in science to be helpful is that it would allow us to add more researchers, not that it would make those researchers dramatically more productive. If anything, we'd expect opening up research to more people to make the average researcher less productive if we have diminishing marginal returns to research effort. So: expanding science to include women and minorities could very well be a dominant factor in our ability to maintain scientific progress over the past half-century without it having had any positive effect on the difficulty of creating scientific advances.
And if the claim was grounded in something like the process of growth that Geoffrey B. West writes about, I might take it more seriously.
And if someone suggested a coefficient like "what we know" divided by "what we know we don't know", I would certainly entertain the notion that we (humanity) have been in a constant state of getting dumber.
But if we divide human history into let's say 35 chunks - maybe ~5000 year periods, I would say there is no evidence that the rate of scientific progress could in any way be said to be decreasing.
The number of journal articles has increased exponentially.
I believe the number of significant discoveries has declined dramatically since about 1970. Certainly the life of the average American has changed less in the past 50 years than in any other 50-year period since America was "discovered".
Mm, I was under the impression that the idea that most hunter-gatherer societies were egalitarian is pretty discredited, contrary to what is implied in the presentation.
No idea, but I would assume that hunter gatherer societies would find it harder to develop the large economic surpluses necessary to support a hierarchical society.
I was delighted to find out that the opening weekend of deer hunting season in Michigan is an especially prosperous weekend on the Magnificent Mile in Chicago because it has become traditional for the wives of deer hunters to flock to the stores to do some expensive gathering.
Rather than political effects or the mechanical effects described herein, I think there are also effects relating to ideology: amateur scientists don’t exist as much because people don’t believe amateur thinkers can provide value anymore, and so they don’t even try. This is distinct from the aspect of only believing credentialed figures when told things, though it is related.
There are strong arguments to be made that a number of scientific fields are wrongheaded in some fashion. In the 1980s, doctors were telling people to avoid SIDS by having babies sleep on their tummies, and now they insist quite strongly the *exact opposite is true.* Numerous “paradoxes” around infinities seem to indicate, at least to some, that maybe we are on a false assumption or two there. Professional physicists have failed to reconcile GR and QM for decades.
The mechanistic model here doesn’t address the “revolution” problem of science: where some philosophical or other assumption is overturned by some “brilliant” new idea (ones that can often be more common with amateurs than professionals - Einstein being a patent clerk is a good example.)
It's not enough that current fields are wrong in some way, it has to be a way that an amateur has some real chance at correcting. I don't know anything about SIDS, but I expect it would take a lot of work for an amateur to even properly understand the open questions when it comes to infinities or quantum gravity, having occasionally tried to understand these myself, and that without this work there is no chance of them making a contribution.
What paradoxes do you think infinities have that modern mathematics fails to resolve?
Whilst I'm not qualified to write of hubris in the medical field, there are a lot of 'boots on the ground' folk who make a lot of good observations, but those folk are not Dr. Fauci level, thus their observations get blown off.
For instance, I work with a guy (non-medical) who ran a COVID testing clinic for a while. Today, he's a true believer in masks—I think he was luke-warm before. But he says "I checked in 91 positive cases in one day, and didn't get sick."
I think it's going to depend a lot on the field. All fields will have some degree of politics to them, because humans are human, but in some fields approximately *all* the barriers are political (eg. many humanities), whilst in others, the knowledge barriers are extremely steep (eg. maths, theoretical physics), or the financial ones are (many experimental sciences, but especially medicine).
Personally, I studied experimental particle physics - while knowledge is somewhat of a barrier there, the biggest obstacle by far is that getting any useful data requires massive machines that take hundreds or thousands of people to build and maintain and cost potentially billions of dollars (I think the LHC is more expensive than most experiments in the field, but the others aren't cheap and they're complementary - you can't do what the LHC is doing without spending billions.) Theoretical particle physics, by contrast, needs little more than a good computer, but the maths is brain-meltingly difficult and all the obvious ideas have been published decades ago.
> amateur scientists don’t exist as much because people don’t believe amateur thinkers can provide value anymore, and so they don’t even try
There are people like this, the trouble is that they tend to be cranks. Cranks are characterised by an excessively strong belief in their ability, as amateurs, to make great scientific discoveries.
Is there a sweet spot between academic insider and kooky crank where you can still make important discoveries? Well maybe, but in most fields you're going to have to spend a lot of time catching up with the current state of the art before you can even find a problem worth thinking about.
I think another cultural piece here is that it is only seen as allowable for cranks to contribute. Sober/serious people just focus on their businesses or what have you rather than trying to email Sean Carroll.
Yes, but the very amateur discoverers were probably seen as cranks by their peers. What do you think of some dude who spends his time polishing little glass beads and trying to look through them? That's van Leeuwenhoek making his microscopes ... but to his neighbors, he's some crank who polishes little glass beads.
Regarding your SIDS example, I'm no scientist, but I was a defense lawyer long enough to see that medicine is science plus something that isn't science.
Claiming certainty about the human body even in a particular case is usually stating too much.
Certainty about general health advice is so far removed from science it's better seen as akin to trends in fashion.
It is really difficult for an amateur to understand state of the art in a specialized discipline, let alone improve on it. You need instruction from people who'll correct your misunderstandings and point out helpful literature, at least, and at that point you're in a PhD program.
Perhaps the premodern equivalent of a PhD program was just chilling with your philosopher buddy, but that is no longer enough.
> amateur scientists don’t exist as much because people don’t believe amateur thinkers can provide value anymore
I think this is true. I also think that this belief is almost certainly correct.
I have an academic position in a world-leading astrophysics department. As such, I receive a LOT of unsolicited theories from amateur thinkers, almost all of whom believe that they have produced new insights which will push physics forward. It's always very high-concept stuff, like using quantum theory to do away with Dark Matter, or combining electrodynamics with relativity to disprove the Big Bang. Etc.
I do tend to read them fairly carefully, and without fail these authors have a very poor understanding of the topic at hand. The best of them make silly mistakes which would be obvious to an undergraduate; the worst of them seem little more than schizophrenic ramblings.
I think we're past a point where amateur researchers can usefully contribute. If you want to do cutting edge science, you need a more experienced scientist to show you the layout of the field, recommend relevant literature, and (importantly) correct your wrong ideas. I was in a top PhD program, and almost all of my cohort had the experience of coming up with what felt like a smart new idea, and being told 'actually Smith et al. tried this back in 1978, and it doesn't work because [x/y/z]'. Without this guidance, even smart people are going to be hopelessly lost. And once you get this guidance, you're no longer an amateur.
Just to chime in, I think the same is true in biology. A lot of resources needed even to answer relatively simple questions. To answer hard questions, you need an extremely strong foundation based upon mentoring and a lot of study that an amateur would have a tough time achieving.
I think this is probably more true the harder/more scientific the field gets, but it certainly doesn't hold for everything. A friend of mine is a mechanical engineer, and has published some linguistics work that has legitimately pushed forward the frontiers of knowledge regarding the language in question, even in the estimation of professionals studying the same (and related) languages.
Linguistics may not be astrophysics or biology, but there's still rigor to it, and it's still possible to be obviously wrong in a way that isn't true of, say, philosophy.
Sliding more toward the philosophy end of the scale, one of my favorite pieces of history writing (Six Frigates) was done by an amateur, and it's well regarded by scholars, too.
Both speak to the accuracy of the foraging metaphor in different ways, I think.
I suspect that what you really want is someone who is outside this discipline (so they don’t share all the same starting assumptions as everyone else) but has been trained in another discipline (so that they have the discipline of thinking rigorously).
I agree with this, but note that some disciplines still just have insane amounts of existing knowledge to absorb first. I think people with the properties that you describe are far more likely to come up with genuinely good ideas than most amateurs, but will still meet the rebuttal "actually Smith et al. tried this back in 1978, and it doesn't work because [x/y/z]" fairly often. Coming up with good ideas is only one part of the problem, they also have to be novel.
I think there may be more wiggle room in the humanities for this; you can be an amateur sitting at your desk going in minute detail over old publications and chasing trails along new lines of thought as a secondary interest to your main job. There is still room for a Michael Ventris in these fields.
Once scientific advance has gone beyond "work it out with a pencil and paper", you really can't do that on an amateur basis; as noted above by several, you need the labs for the practical work and the advisers to steer you away from dead-ends.
If you’re going to be the amateur genius who deciphers Linear B, it helps to have an Alice Kober do 20 years of heavy lifting on the project before you get started.
Kober was a Columbia-educated professor of Classics who spent nights and weekends for decades doing the kinds of frequency studies you could now do in seconds with a computer. She made 180,000 index cards.
Kober collaborated with other specialists, but didn’t publish about her work on Linear B until she’d been working on it for 15 years. She won a Guggenheim Fellowship to devote herself to the problem full time for a year. And then, perhaps on the brink of cracking Linear B, she died. Michael Ventris and his collaborators inherited all the resources she developed, which they acknowledged.
It’s not like Kober wasn’t recognized, but Ventris got the lion’s share of the fame while it could be argued that Kober, who was no amateur, did the lion’s share of the work.
Sure—he's been doing comparative linguistics between documented varieties of the Nivkh language (or Gilyak, in some sources; it's nearly extinct but was historically spoken in the Amur basin and on Sakhalin Island).
"Application of the comparative method to vocoid sequences in Nivkh" should get you going in the right direction, if you want to know more. Most (all?) of his Nivkh-related output is in Kansas Working Papers In Linguistics, but he did get a paper on Quechua into IJAL.
Einstein might have been a patent clerk, but he already had a university education in physics and was also working on a PhD dissertation while at the patent office; Einstein is a good example of a young person creating a revolution, but not a strong example against credentialism or anything. (I'm getting this info from https://en.wikipedia.org/wiki/Albert_Einstein#Patent_office). Also, his patent clerk job still actively involved his science background.
Sort of. "Physics grad student" is a weaker credential for competence in physics than "physics PhD", but it's a stronger credential than "random patent clerk", and physics grad students are taken seriously as producers of physics research in a way that patent clerks with no particular formal background in physics are not.
For example, I'm pretty sure that today, reputable physics journals publish a lot more papers written by physics grad students than papers by clever laymen.
Someone "working on a PhD dissertation" has already qualified for at least a Masters, even if one refuses to factor in the x% of a PhD.
PhD students are actively expected to produce and publish novel research, it's literally a requirement to graduate; it's just that these days "incremental improvements" are proportionally even more of the publications than ever before.
I think the more important question is: is there no longer opportunity (and/or motivation) for brilliant 20-somethings to earn a living (in a job that fits their education), work on a PhD dissertation, and still have enough free time to let their minds wander and think about fundamental problems of the science?
The most common complaint I hear is that if you are a PhD student working in "a lab", a great deal of your time goes to either teaching assistant duties or running the experiments your professor instructs/orders you to do.
People call it a master-apprentice model of science education and think it sounds nice and old-timey. Very few people remember that apprenticeship was often grueling work done on very unfavorable contract terms. Benjamin Franklin ran away from his brother's print shop and founded his own in Philadelphia (and became successful). So did Rousseau, apprentice to an engraver in Geneva, who ran away and found himself the protégé and later lover of a random French noblewoman.
It varies by country and by discipline. In my case, all my time was for research towards my thesis, though classmates who weren't still living with their parents spent a not insignificant amount of time teaching undergrads for money. However, doing research directly necessary for my thesis was a more-than-full-time job, so idle chats about the great mysteries of the field were rarer than I'd have liked.
By contrast, my girlfriend did a PhD in literature, and describes a positively relaxed workload.
In the Days of Discovery, a scientific education—any education—was available to only a very few. Thus the Age of Discovery people whom we consider 'amateurs' were the very people who would have vocational degrees in today's world.
One other place where this theory would predict a difference is in the artistic domains. Since we explicitly value novelty, we don't run out of easy melodies like we do easy scientific discoveries.
Music fits this theory in some ways (the most acclaimed artists are young) but not in others (to succeed you need to dedicate all your focus).
Unfortunately, these areas are subjective so determining decline is impossible. But if decline is real we would expect a steady decline in artistic greatness over time.
Though modern instruments also represent an unforaged area. Beethoven did not have a turntable, Bach did not have an amp. And I don't have stats on this, but I'd suspect modern classical composers (movie composers?) tend to be middle-aged.
I think the degree to which novelty is valued in the arts is overstated. It's valued by (some) critics; not so much by the wider population. At the very least, insofar as they care about novelty, the audience only care about novelty relative to what they've experienced before themselves, not whether something is objectively novel, so the good ideas can be mined again and again with each generation, rather than being depleted.
You can't be considered a genius in the hard sciences by rederiving all of Einstein's equations; John Williams can become one of the most celebrated orchestral composers of his day by (masterfully) emulating the techniques, and in some cases tunes, created by his elders like Gustav Holst and Erich Korngold, in part because most moviegoing audiences who latched on to Williams had never really heard much Holst or Korngold.
(Maybe the argument is a bit facile, but it's easy to construe the much-discussed remake/adaptation obsession of cinema and television in the 21st century as the ultimate consequence of this. All the 'best' ideas, in terms of mass appeal, *have* been found, so now they're just getting recycled over and over instead of barrel-scraping for new ones. A simplification, I think, but I believe it does point at something real.)
My problem with this model is that human genius should be at least semi-constant on at least a per capita basis if it's primarily genetic. If it's primarily environmental then you should expect to be able to produce it in a way we haven't been able to. If it's a combination (like I believe) then you're waiting for the right environmental conditions in which genius can express itself.
However, this has a marginal effect. In the worst conditions one or two geniuses will shine through. In moderate conditions a few more. In abundant conditions many. But once you have many geniuses it makes sense to specialize. When you're one of twenty scientists in England then it makes sense to do five things and to make foundational but ultimately pretty rudimentary discoveries about them. When you're one of twenty thousand then it makes sense to specialize in specifically learning about... I don't know, Ancient Roman dog houses. This creates more and higher quality knowledge. But it creates fewer towering geniuses.
Further, keep in mind you don't have to outrun the bear, you just have to outrun your competition. You can get a lot wrong and so long as you're relatively more correct you'll do well. This also explains how amateurism decreases. A few hundred years ago I'd probably be able to make some serious contributions to a variety of fields. Now I can't. Not because I do not know those fields or have interesting thoughts about them. But because back then I'd only have had to contend with a few scattered amateurs, whereas now I have to contend with several thousand people who have spent their entire lives studying whatever it is as a full-time, professional job.
This all seems reasonable as far as it goes, but maybe the impression that we are producing fewer geniuses nowadays is more due to relatively contingent and parochial-to-humans facts about the sociology of fame assignment within groups (as hinted at briefly in point #3) than it is about great feats of insight, or what make for good examples of creativity or impactful problem-solving or whatever.
In the forager analogy, the other foragers considering the problem of who finds good fruit sources are only able to consider foragers that came to their attention in the first place, and that could be due to reasons other than the actual fruit-harvesting (especially if the impact of fruit-harvesting is as hard to quantify in isolation from fame as that of scientific genius).
>maybe the impression that we are producing fewer geniuses nowadays is more due to relatively contingent and parochial-to-humans facts about the sociology of fame assignment
Right. This debate seems to actually be about "What makes geniuses celebrities?" One thing that helps make geniuses celebrities is having been born before the 20th century. 20th and 21st century media have made it harder for geniuses to compete with non-geniuses in celebrity space.
How much of our definition of "genius" depends on celebrity, though? Certainly there have been some famous-with-the-general-public-in-their-time scientists, but would a lot of the people we now label as geniuses get recognized on the street when alive?
Well, "fame within groups" doesn't necessarily imply fame-while-alive, or fame-with-the-general-public. For whatever that is worth. Scott's Example #3 was contemplating the politics of small research groups.
My point is that our awareness of -- though not our definition of -- genius depends on fame. How can one person take stock of the number of geniuses across hundreds of fields without using fame as a heuristic?
> would a lot of the people we now label as geniuses get recognized on the street when alive?
I suspect many of their names would have been known by upper and upper-middle-class people. All the examples cited in "Contra Hoel": Newton, Mozart, Darwin, Pasteur, Dickens, and Edison were celebrities within their lifetimes.
There are a lot of examples of famous geniuses who died in obscurity, but they almost all died young.
In 1975, my high school teacher said, contra the Romantic notion that great artists aren't appreciated within their lifetimes, that practically everybody who is famous today was famous within three score and ten years of his birth. E.g., Van Gogh only sold one painting in his lifetime, but if he could have lived another 33 years to age 70, he would have been rich.
Vermeer _might_ be a counter-example. On the other hand, he seems to have been appreciated enough in his own lifetime to have the luxury of working extremely slowly, turning out only one or two paintings per year. But then he died fairly young, and soon after Louis XIV of France attacked the Dutch Republic, which ended the Golden Age of Dutch painting as the economy shrunk.
Vermeer's repute, whatever it was during his lifetime, then faded, although Paul Johnson says that a small line of connoisseurs passed down word of Vermeer's greatness for almost two centuries until he was generally rediscovered in the Victorian age.
Here's a way to measure this question, testing a pre-selected sample: how many Manhattan Project scientists would have been completely unknown walking through, say, Grand Central Station in the decades after 1945?
I'll give you my subjective opinions as somebody born in 1958 who isn't bad at remembering what was common knowledge and what wasn't. But I'm not a scientist, so the following doesn't reflect the opinion of a professional.
To the extent that Einstein was involved in the atom bomb project, yes, he was an immense celebrity, as famous of a face as Marilyn Monroe.
I think Oppenheimer would have attracted attention from a not insignificant fraction of the passers-by. He was a very distinctive looking man with a gaunt Cillian Murphy-like face (I suspect Christopher Nolan is making his "Oppenheimer" biopic to give his friend Murphy a major role.) Of course, much of his fame/notoriety derived from the hoopla over his security clearance being stripped in 1954 due to his many Communist friends.
Von Neumann was less unusual looking, but he was on TV a lot before his early death.
Fermi's picture was widely shown.
Teller was on TV a lot in the 1970s arguing in favor of nuclear power and the like.
Bohr was a giant, but I have no recollection of his face. Same for Bethe, Wigner, Szilard.
People that aren't all that famous anymore like Seaborg might have been on TV a lot.
Feynman is a folk hero today, but I can recall reading James Gleick's magnificent full page obituary for Feynman in the New York Times in 1988 and thinking to myself, "Wow, this guy was awesome, why did I never much notice him before?" Obviously, Feynman was a legend among physicists while alive, but I don't think he made much impact on the public until the space shuttle explosion hearing soon before his death. And, even then, I wasn't aware of his now legendary O-ring dunking in ice water demonstration until his obituary.
>20th and 21st century media have made it harder for geniuses to compete with non-geniuses in celebrity space.
This sounds plausible. New media (radio/TV/internet) does seem to have more of an effect on the fame of athletes, entertainers, charismatic politicians, and talking heads than on the fame of scientific innovators and sophisticated theorists. This is most likely a shift from the prior era, when fame was built on word-of-mouth and journals/newspapers.
One assumption here is that there is some limit to the number of people we can meaningfully know by reputation, at least without dedicated memorization. I guess this would be similar in effect to Dunbar's number (but presumably not similar in cause).
Scientific innovators do get some new media coverage. But the increased specialization of these innovators' fields, mentioned elsewhere in this thread, makes their achievements less comprehensible, even when explained well on slickly produced video. And so we probably retain less of what we do learn about these innovators. Which would exacerbate the new media effect.
I think the other thing is that often new scientific achievements are done in teams. Much harder to remember 5 names than the 1. Can we meaningfully remember the scientist that developed mRNA vaccines for COVID? No, because they're scientists plural.
I think there is some evidence that support the idea that ML researchers make breakthroughs at younger ages. The classic example would be Ian Goodfellow who invented GANs while a grad student. Also the Turing Award winners, LeCun, Hinton, Bengio, all did their seminal work while much younger.
I don’t buy it. This assumes the subset of the space that’s been searched is a significant fraction of the total space (even if you just consider the “easy” subset). If it’s small, you can always just move slightly to the frontier of the set and find new low-hanging fruit. There’s no reason a priori to assume that this region should not be huge.
In my area, theoretical physics, I see plenty of interesting research problems that are no more difficult than problems a generation or two ago. In many cases, the problems are easier because we have much more powerful tools.
I do, however, see the field unable to pay grad students, unable to get grant money relative to other fields, hemorrhaging good students to outside of academia, trapped in a nightmare of academic bureaucracy, and with an increasingly large number of outright crackpots.
I suppose that one of the problems with modern theoretical physics is the enormous cost of the machines necessary to experimentally confirm the theories.
Or a lack of sufficient insight and creativity to imagine low-cost experiments that could do the same. What did the microwave antenna cost, which Penzias and Wilson used to provide probably the single best piece of evidence ever for the Big Bang? Had you asked someone in 1960 what an experiment that would provide solid evidence for that theory might cost, there's a pretty decent chance he would have named some exorbitant figure -- because the idea of just pointing a big microwave antenna at an empty patch of sky hadn't occurred to anyone.
I had dinner with Nobel laureate Robert Wilson in the late 1970s. He was very modest about how he didn't know that his discovery of universal background radiation proved the Big Bang Theory until some Princeton physicists explained it to him and Penzias. Princetonians have complained about that Nobel ever since.
But as another astronomer asked me a few years ago, "Do you think it's a coincidence that they gave the Nobel Prize to the best experimental radio astronomer of his generation for his greatest discovery?"
In physics I think that a large fraction of the space has already been searched. That is, for pretty much any physical phenomenon you can think of, we have a pretty good explanation. A few centuries ago a layman could sit around and wonder "What the heck is the sun?" or "What's the deal with fire?" or "Where do rocks come from?" but nowadays you can find pretty good explanations to all of these in books aimed at eight-year-olds. We've harvested pretty much the whole space of questions that an interested layman might be able to think of.
The only questions we still can't answer tend to be either (a) extremely obscure questions that only an expert could think of, or (b) possibly in-principle unanswerable like "What happened before the big bang?" and "Why is there something rather than nothing?"
I'm not sure this is true. From things I know about: sonoluminescence does not have an accepted explanation. The Russian toy that turns one way only (the rattleback) is easy to simulate (and the simulations do agree that it spins in one direction only), but there is no model that describes the motion (without the need to integrate) in a way that makes it obvious that it will spin in one direction only.
What the heck is turbulent flow and how to predict its behavior?
What is ball lightning?
Why is the sun's corona way hotter than the surface?
Why can you wring gauge blocks?
What happens to the airflow inside a blues harmonica when you bend notes?
Why do chain fountains work, exactly?
Can you jump into a supermassive black hole and survive, as the gravitational gradient wouldn't tear you apart? What would happen from your perspective?
While sounding simple, these all sound to me like derived, deeper exploration spaces of the much more accessible and understandable questions, like "why do things fall?", "what is water?", "what is sound?". To see black holes we are "standing on the shoulders of Giants", and to answer many of these we need tech and knowledge invented by exploring the much simpler questions.
Barring perhaps turbulent flow, these questions seem much more esoteric than the kinds of questions that you could reasonably ask 200-300 years ago. Like, they could ask, "What is lightning?" whereas you ask, "What is a super-rare kind of lightning that almost nobody has ever seen?"
That's because you are privileged with knowledge a person 200-300 years ago would have lacked. Everyone, as he gains information, finds the questions that puzzle someone at a lower level seem simple, and questions that puzzle him at his current level seem complex. But what's "simple" and "complex" change predictably with your current perspective, just like what's "poor" and "rich" change with your own current income.
I mean, I think it's not. Isn't my example pretty clearly an example of something that's just vastly more esoteric by any standard? We have explanations for almost everything that we commonly encounter, and the things we lack explanations for are almost entirely things that are ultra hard to observe. This was not true 300 years ago.
A set of gauge blocks is under 100 dollars, and they're routinely used by at least a 6-digit number of workers in the US. Not exactly rare.
Some of the others are certainly bad examples though. We can set up accurate differential equations to describe turbulent flow physically, but they're just not mathematically well-behaved. They're not solvable analytically, of course, as complex differential equations tend not to be, and they're absurdly sensitive to initial conditions and simulation error.
This isn't a flaw in the equations though; it's them accurately representing the state of nature. Turbulent flow itself is horribly behaved in much the same ways, and virtually impossible to force replicable flow outside of the most carefully controlled lab environment.
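As a loose illustration of what "absurdly sensitive to initial conditions" means (using the logistic map, a standard toy chaotic system, not a fluid model), here's a minimal sketch:

```python
# Toy demonstration of sensitivity to initial conditions using the logistic
# map x -> r*x*(1-x) in its chaotic regime. Not a model of turbulence, just
# an illustration of why tiny initial errors ruin long-range prediction.
r = 3.9
x1, x2 = 0.500000, 0.500001   # two starting states differing by one part in a million

for step in range(1, 51):
    x1 = r * x1 * (1 - x1)
    x2 = r * x2 * (1 - x2)
    if step % 10 == 0:
        print(f"step {step:2d}: difference = {abs(x1 - x2):.6f}")
# Within a few dozen steps the difference is of order one: the initial error
# has swallowed the forecast, which is the same qualitative problem that
# turbulence simulations face.
```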
This. Reading up on gauge blocks, since I'd never heard of them before: is the molecular adhesion really that mysterious? It seems like the sort of thing where the details might be hideously hard to compute but the gist of it is simple enough.
Well, we know the equations for fluid flow, and have done for over a century. The fact that they aren't practically solvable is the issue. This one is one I could see an "amateur" solving, in the sense that I expect advances to come either from an obscure mathematical advance (if analytic) or from an advance in computing (if a numeric approximation) rather than from physics directly.
As for black holes, that's in the "literally untestable even in theory" category - easy enough to say that you'd survive to the event horizon if the grav gradient were sufficiently low and your craft sufficiently shielded (the accretion disk gets extremely hot), but the very definition of an event horizon is that what's inside is unknowable.
The others are much further from my wheelhouse, and I haven't even heard of gauge blocks or chain fountains, which rather argues against them being the kind of phenomena laymen think about deeply.
You should check out Steve Mould's youtube channel. He's got lots of videos about this kind of physics of random everyday stuff. Sometimes he comes up with a convincing explanation, occasionally there's still a lot of mystery left about how the thing works. I bet there's at least a few videos there where a detailed explanation of exactly what's going on would be new to science.
Also: No one knows how high temperature superconductivity works, though I guess you'd say that's not accessible to laymen.
given the price of high-temperature superconductors, no, I think you need to be a fairly well funded lab to do any experiments on those.
I will, however, note that a lot of historical "lay scientists" were aristocrats who could throw rather a lot of money (for the time) at their experiments, so maybe it's not entirely fair to talk about purely financial barriers to a field?
"This assumes the subset of the space that’s been searched is a significant fraction of the total space ... ."
It assumes that searching the subset of space that has been searched has consumed all the "search energy" of the searchers to date. It says nothing about outer limits.
What's your starting point for "a generation or two ago"? I might agree that many of today's theory papers could have been published in 1990 if the computer tech were available, but also I have the impression, in my field at least, that major theory advances have been slow since the 1960s.... (which is not to say it's the fault of theorists at all, but rather that many of their theories are extremely hard to test and there's seriously diminishing returns for the 100th untestable theory to explain something). Some of it is also just luck - supersymmetry is a beautiful, elegant theory, and when it was first thought of it seemed quite plausible that it would be true, but when the LHC finally got running, there was no sign of any SUSY.
Scott, I think this model has less explanatory power than your previous* model, because it fails to account for discoveries which make other discoveries more available. For example, had Newton invented calculus but not the laws of motion, this would have reduced the depletion of the forest in that area, because some things which *could* be discovered without calculus are much easier to discover with calculus. Maybe you could throw in something like builders (teachers in real life) who build roads which make other areas easier to reach?
The point of this is that more innovations in whatever makes things easier to understand, maybe educational psychology (if it's effective at making better learners at scale, which I don't know), will reverse this trend, and the model should have something to reflect that.
I agree with this argument. Scott's model assumes scarcity - scarcity of fruit, scarcity of knowledge, scarcity of science. However, a dynamic that may apply to fruit might actually be inverted for science/knowledge. Knowledge begets knowledge. It's a tree that continually branches ad infinitum. 1000 years ago, medicine was a shrub, whereas now it's a massive tree the size of a jungle (sorry for the sloppy metaphors). An invention such as CRISPR or mRNA technologies opens up new frontiers for tens? hundreds? thousands? of important inventions.
Taking educational psychology as an example - Piaget may have made quite important discoveries, but the "science" or the "knowledge tree" of educational psychology is still a shrub. Perhaps it will be a jungle in a few hundred years and everything we think and do vis-a-vis development/education will be different. If important discoveries in educational psychology are made, they are not decreasing the discoveries to be made, but organically? increasing them.
An excellent point. If this branching tree metaphor is better, and I think it is, we might expect some branches to crap out earlier than others (so we get all the breakthroughs and know everything about the field in that branch) but there should be ever more questions to answer, and more discoveries to be made.
If I were to try and tie it back to Scott's metaphor, advancing technology should allow for more efficient searching over time. You build roads (someone mentioned this above), you get horses, you stop going toward places that just don't seem to have more food. This should allow you to access increasingly more food with even a linear progression in how fast you can cover distance, because the area of the circle gets bigger.
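Concretely, with made-up numbers: if better transport makes the reachable radius grow linearly, the reachable area grows with the square of that radius, so each equal step in "speed" opens up more new ground than the last.

```python
import math

# Toy numbers: the radius you can cover grows by 1 with each transport upgrade.
previous_area = 0.0
for radius in [1, 2, 3, 4, 5]:
    area = math.pi * radius ** 2
    print(f"radius {radius}: total area = {area:6.1f}, newly opened = {area - previous_area:6.1f}")
    previous_area = area
# The newly opened area each step is pi*(2r+1), which keeps growing even
# though the radius gain per step is constant.
```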
Of course, Scott's model doesn't include things like "Keep going over the same barren, dead end ground because someone wants to pay you to cover that ground, because they want that ground to be true." Sort of the normative sociology that economists make fun of: the study of what should be causing society's ills. Most sciences that even tangentially relate to public policy start to fall into this trap.
To apply Scott's metaphor, scientific discovery is really all about the mushroom caves, not the flat ground. Random events or quirky observations open up whole new approaches to understanding the universe, which then quickly get fully explored by more workmanlike research. This is basically Kuhn's model, and I'm not sure why Scott tends to ignore it--especially since it helps explain why the enormous edifice of the professional scientific community, which is primarily concerned with picking over known ground, produces fewer breakthroughs than its size would predict under Scott's "foraging" model.
I only read about the first third of Kuhn's big book about Paradigm Shifts, but I came away with the impression that, to my surprise, its celebrity is deserved.
These metaphor extensions are fine, but the core virtue of Scott's metaphor (scientific progress ~ picking reachable fruit) is that it's very simple and yet generally fits the data on how our scientific knowledge grows. Or are you suggesting that there is some systematic data that it fails to predict well?
Yes, it fails to predict the actual pattern of scientific advancement, which more closely resembles a series of random bursts of varying size, depth and speed than a slow, steady march.
I think there's an underappreciated aspect here which is that lots of advances may be happening but it gets harder and harder for Scott (and other laymen) to appreciate them.
My subfield was effectively inaccessible until the last century, and my impression is that most scientific fields are similar. No one is asking questions that were conceivable to ask 100 years ago, maybe even 50 years ago. In the metaphor, we set up new cities close to the frontier, and get better and better railways over time. By the end of grad school, you've mastered a multitude of ideas that are pretty new and unexplored, because once someone figures something out, they can teach others.
So this model feels wrong from inside. I'm not constantly walking over already-known things, I learned the modern stuff in school and now everywhere I look is un-tread territory.
But that's a subfield. If you look at math as a whole, there was more big re-defining work in the 1800s than the last 100 years. So to me, things are getting shaken up and expanded all the time, but if you lack the expertise to understand individual subfields, it's hard to explain what new things are happening.
It's unclear to me if the laymen are too far away to see the truth, or the experts are too close to see the truth.
It's often said, and seems objectively true, that far more math has been invented in the last 50 years than the rest of human history put together. One reason for that is there are massively more mathematicians in the world now, for many reasons. So something is happening, the question is how to value it and how to value the people who do it.
I don't recall hearing about Alan Turing and John Nash when I was young, but today, Turing and Nash are the heroes of Academy Award nominated movies. So, fame progresses. There are probably brilliant people born in the 1990s who are little known to the general public today, but who will be folk heroes in the second half of this century.
I think some of it is the distance between novel science and practical application, which has a large time lag and a large component of luck. Maths in particular seems to have a habit of going from obscure trivia to pivotal bedrock hundreds of years after discovery, which of course is good for the fame of long-dead mathematicians and terrible for the fame of currently active ones.
Plus, some of the early geniuses may well be “names associated with” rather than “sole inventor of.”
For example, despite improvements in the history of science, I bet there were still some husband and wife teams where only his name is remembered (at least in popular culture).
Or Darwin: clearly his ideas grew out of the shoulders of the giants upon whom he was standing, that’s why other people were able to come up with them as well. But we don’t remember the names of those other guys. Similarly for Newton/Leibniz: sometimes the genius halo grows more out of our desire to have a single historical hook on which to hang our story of scientific advances, rather than a deep understanding of the science process.
And if our perception of past genius is distorted by the lens of history, then our comparisons with current geniuses will be less accurate.
And the mechanism in question was actually discovered by Anderson several years before Higgs. Higgs himself credited Anderson for the mechanism. But (a) Anderson already had a Nobel prize and (b) he had made himself persona non grata among high energy physicists over the Superconducting Supercollider affair, and no way they were going to let `their' prize go to Anderson...
of which, IIRC 3 shared in the prize? at least one was dead by the time it was awarded, which is not that surprising when there's 60-odd years between the prediction and the experimental confirmation.
This model seems a bit oversimplified in two important ways.
1. Ideas don't really "deplete" like this. Say you come up with some good ideas around factoring large numbers into primes. Someone else invents the computer. A third person puts them together and gets RSA. All three of those are good valuable work, but I wouldn't think the third idea was "further out" than the first (in terms of how long it would take to get there). It was just gated on the computer.
Lots of ideas are like this -- simple, but dormant until the other necessary ingredients are ready. (A toy sketch of the RSA case is at the end of this comment.)
2. The campsite "moves" over time. A whole lot of our cognitive technology is encoded deep in our language, tools, norms, etc., and isn't fixed year over year. Even if today's people and yesterday's people could travel the same distance on average, today's people would still be biased to discovering new things -- just by virtue of starting off somewhere else.
Some of this technology is more literal: computers are something like a bicycle in this metaphor. The early astronomers were analyzing data by hand!
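To make the RSA example in point 1 concrete, here's a toy sketch with the small textbook primes (real RSA uses enormous primes and padding; this only shows how the number theory plus a computer combine into something neither gives you alone):

```python
# Toy RSA with tiny textbook numbers -- completely insecure, purely illustrative.
p, q = 61, 53
n = p * q                    # 3233, the public modulus
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent (modular inverse), 2753
                             # (requires Python 3.8+ for this form of pow)
message = 65
ciphertext = pow(message, e, n)      # encrypt: m^e mod n  -> 2790
recovered = pow(ciphertext, d, n)    # decrypt: c^d mod n  -> 65
print(ciphertext, recovered)
```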
Point 2 is pretty important. Tools develop. But not all of these are linear: needs develop, too, changing what tools we _think of_.
Analogy: we tame pack animals and horses, and can carry more and go further. Eventually we develop agriculture. We grow big trees, but now to get more fruit we need to go higher... going further is not an option.
So someone develops a ladder. But someone could have developed a ladder at any time, if they had spent time solving the problem of climbing higher rather than solving the problem of walking further and carrying more. Differing needs can give us insight into spaces that were always there.
Machine learning is definitely still one of the low hanging fruit areas. In this case, you can turn a drug discovery ML system into one that can discover VX nerve gas and a whole new, exciting range of chemical weapons just by inverting the utility function....
Worth noting that the objective function for a machine learning algorithm that finds nerve gas is not in any sense the inverse of one that finds a drug. Nerve gases ARE drugs, from the perspective of a machine learning algorithm. Just drugs with specific endpoints, a high premium on low-dose requirement, and a different set of example drugs to work from.
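To sketch what I mean with an entirely made-up scoring function (no real chemistry, model, or data here): the "flip" is better thought of as changing the sign or weight on one term (predicted toxicity) inside a multi-term objective, while the potency and dose terms keep pointing the same way.

```python
# Entirely hypothetical scoring function -- illustrative only.
def score(potency, toxicity, dose_needed, toxicity_weight):
    # Higher potency is always rewarded and needing a large dose is always
    # penalised; only the weight on predicted toxicity differs between the
    # two search objectives, so the second is not the inverse of the first.
    return 2.0 * potency - 0.5 * dose_needed + toxicity_weight * toxicity

candidate = {"potency": 0.8, "toxicity": 0.9, "dose_needed": 0.2}

drug_score  = score(**candidate, toxicity_weight=-3.0)  # toxicity penalised
agent_score = score(**candidate, toxicity_weight=+3.0)  # toxicity rewarded
print(drug_score, agent_score)
```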
(Forgive me if this point has been made already, I'm writing this comment quickly)
I've been thinking about this a bit recently because I'm trying to write a piece about a related topic (and I listened to this interesting BBC documentary https://www.bbc.co.uk/programmes/m0015v9g on the slowing of science and progress). There's another mechanism which you don't model here: in the foraging model, finding fruit only makes it harder to find new fruit. But in science and tech, a discovery or invention makes future discoveries or inventions easier.
For instance, a wood-handled flint axe is a combination of two earlier inventions, the stick and the hand-axe. Newton's observations about gravity are possible because of the earlier invention of the telescope. The invention of the iPhone 13 is possible because of the earlier invention of the [various things, transistors, touch screens, etc].
So there's a countervailing force: individual discoveries become harder *given a base of zero knowledge*, but there are also new discoveries that become possible because they are simply combinations of earlier discoveries (or new technologies make them more accessible).
In your model it might be more like you're loggers, rather than foragers, and cutting down some trees allows access to new trees, but somewhat further off? I don't know what the equivalent of height might be, but perhaps strength.
I think you don't distinguish inventions (technology) from discoveries (science) enough. The examples you give are mostly technology, where indeed depletion is less evident, because the fact that new inventions must be more complex (the simpler ones are done) is compensated by the fact that new invention are often a combination of existing elements, and progress make more/new elements available to combine.
For discoveries, it's not exactly the same. Progress is usually a way to make new observations available, but that's only part of scientific discovery: it helps discriminate between competing theories or show that an existing theory is not adequate and maybe hint at possible evolutions/replacements. Progress is like having new ways to reach the frontier faster, but which you also have to learn: having a car will surely make you faster, but you also spend time learning to drive.
So I think there is indeed a low-hanging fruit/foraging-ground exhaustion effect, especially visible in non-combinatorial fields, like base tech (energy production, for example) or fundamental science (physics is a prime example).
oh I definitely think the low-hanging fruit phenomenon *exists*. But I think there's a countervailing force, of previous discoveries/inventions making new ones possible. I didn't distinguish between the two very much because I don't think the distinction is hugely important - Maxwell can't come up with the equations describing electromagnetism without [consults Wikipedia] Ampere's work on electrodynamics and a hundred other people. (Newton's "if I have seen further it is by standing on the shoulders of giants" quote seems relevant here.)
> Let’s add intelligence to this model. Imagine there are fruit trees scattered around, and especially tall people can pick fruits that shorter people can’t reach. If you are the first person ever to be seven feet tall, then even if the usual foraging horizon is very far from camp, you can forage very close to camp, picking the seven-foot-high-up fruits that no previous forager could get. So there are actually many different horizons: a distant horizon for ordinary-height people, a nearer horizon for tallish people, and a horizon so close as to be almost irrelevant for giants.
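A minimal toy simulation of this picture (all numbers invented, nothing calibrated to anything real): fruit is scattered at random distances and heights, ordinary foragers clear out the low-hanging fruit near camp first, and a rare "seven-footer" arriving late still finds unpicked fruit almost next to camp.

```python
import random

random.seed(0)
# Invented parameters: 5000 fruits at random distances (0-100) from camp and
# random heights (0-8 feet), sorted by distance.
fruit = sorted((random.uniform(0, 100), random.uniform(0, 8)) for _ in range(5000))
picked = set()

def forage(reach):
    """Walk outward from camp and pick the nearest unpicked fruit within reach;
    return how far the forager had to walk."""
    for i, (dist, height) in enumerate(fruit):
        if i not in picked and height <= reach:
            picked.add(i)
            return dist
    return None

# 2000 ordinary foragers (reach 6 ft) go out first, each picking one fruit.
ordinary_walks = [forage(reach=6) for _ in range(2000)]
print("last ordinary forager walked:", round(ordinary_walks[-1], 1))

# A rare tall forager (reach 8 ft) arrives late but still finds fruit near camp,
# because the high fruit close to camp was out of everyone else's reach.
print("tall forager walked:", round(forage(reach=8), 1))
```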
Well, there's the famous observation that war and strife created da Vinci and Michelangelo, while hundreds of years of peace in Switzerland could only create the cuckoo clock.
That's an interesting viewpoint, and I realize I hadn't analyzed my own viewpoint before.
1. Hunter-gatherers were surely always at war and at great personal risk. They didn't have any remarkable scientific progress for millennia. This supports your point.
2. However, the Second World War can be said to be directly responsible for the advent of the computer, atomic bomb, radar, etc.
Is it possible that if hunter-gatherers only had wars every 5 years or so, and not every week, they would put in a lot of resources into developing new weapons, thus heralding the technological revolution at an earlier date? Is it also possible that if the Second World War had been much shorter, say 1 year instead of 6, a lot of the present-day technology would never have been developed? It seems to me that for technological progress, we need urgency in the form of war, but also slack in which we can play around with crazy and unknown ideas to develop the best inventions/weapons.
There's also the question of *refinement*; you can be a Great Genius of your period, but if the technology isn't up to it, then what data you can gather and what experiments you can perform and what working devices you can create are limited.
The Michelson-Morley experiment was important because it disproved the theory of the ether and in turn kicked off research that would eventually develop into special relativity. But there hadn't been sufficiently precise instrumentation to do such an experiment before then; same with measuring the speed of light, etc.
Your hunter-gatherers can be smart and innovative, but there is only so much they can do with their first tools, which have to be worked on to produce better tools, so that more ore can be mined and smelted, and smelting process itself improved, so eventually you can manufacture steel and then you're going places with what you can make, how precise it can be, and how flexible and useful.
The purported lack of genius today may be down in part to something as simple as "we're still working with bronze implements, we haven't even got to steel yet".
There may be a significant emotional difference between being attacked by enemies and genocide by your own (or nearly your own) government. In war, you have enemies. In a genocide, you can't trust your neighbors.
One thing that needs to be explained is that, when Nazi Germany fell, Jews still existed. A lot of the geniuses from the 40s still existed. Maybe it was a specific sort of schooling that went away-- institutions can be smashed.
It's possible that it's not that the level of accomplishment has dropped, it's just that the publicity machine for calling geniuses isn't working as well.
Or it's possible that there's a level of trauma which needs to fade.
On the art side, I suspect that a lot of creativity is going into gaming. Someone could be an amazing dungeon master, but their art is personal and ephemeral and they aren't going to be picked out as a genius.
Video editing is a new art form, and it can be popular, but no one takes it seriously the way older art forms are taken.
Leonardo did work as a military engineer and architect for several patrons. We think of him mostly as an artist, but he would have been expected to - and was very much capable of - turn his hand to anything. From Wikipedia:
"Leonardo went to offer his services to Duke of Milan Ludovico Sforza. Leonardo wrote Sforza a letter which described the diverse things that he could achieve in the fields of engineering and weapon design, and mentioned that he could paint. ...When Ludovico Sforza was overthrown by France in 1500, Leonardo fled Milan for Venice, accompanied by his assistant Salaì and friend, the mathematician Luca Pacioli. In Venice, Leonardo was employed as a military architect and engineer, devising methods to defend the city from naval attack. In Cesena in 1502, Leonardo entered the service of Cesare Borgia, the son of Pope Alexander VI, acting as a military architect and engineer and travelling throughout Italy with his patron. Leonardo created a map of Cesare Borgia's stronghold, a town plan of Imola in order to win his patronage. Upon seeing it, Cesare hired Leonardo as his chief military engineer and architect. ...In 1512, Leonardo was working on plans for an equestrian monument for Gian Giacomo Trivulzio, but this was prevented by an invasion of a confederation of Swiss, Spanish and Venetian forces, which drove the French from Milan. Leonardo stayed in the city, spending several months in 1513 at the Medici's Vaprio d'Adda villa."
Orson Welles' famous comment on the Swiss could be quantitatively tested.
My impression is that the Swiss are reasonably accomplished but that, Switzerland lacking huge cities, they typically reach their peaks in other countries. E.g., Rousseau became the most famous Parisian intellectual of the second half of the 18th Century, Einstein created the theory of general relativity in Berlin, and Euler and some of the Bernoullis spent years in St. Petersburg.
It's a little like West Virginia: West Virginia is lacking in celebrities, but California was full of West Virginian heroes like Chuck Yeager and Jerry West.
On a single small point: "Since a rational forager would never choose the latter, I assume there’s some law that governs how depleted terrain would be in this scenario, which I’m violating. I can’t immediately figure out how to calculate it, so let’s just assume some foragers aren’t rational".
Isn't there a question of personal preferences and aptitude? Sure, it'd be more productive to go over there but I happen to really like it here and foraging that particular ground makes me feel competent while going over there is arduous for me.
Hence even if it would be more 'rational', I'm not going to do it. 'Irrational' is an acceptable descriptor for that behaviour in economics, but it may not be quite 'irrational' in everyday parlance, it's just optimizing for different objectives.
Let me give an epistemic reason for the stall. There’s a clear barrier to recent progress of the traditional kind, which is (to use the jargon of my colleagues at the Santa Fe Institute) complexity.
Complex systems are not amenable to the Francis Bacon style “vary and test” experimental method. We’re learning a huge amount but returns to experimental methods of the causal-control kind are hard to come by. Taleb is a good example of a person — a real, no-BS practitioner — who recognized many of the same things the SFI people did. In a funny way, so was David Graeber.
Examples of complex systems include the human mind and body; hence why we’ve had so little progress in getting control of (say) depression or cancer, and why the human genome project fizzled after we found the 1-SNP diseases. Much of Econ is similar (IMO the RCT era is overblown). ML is CS discovering the same.
They’re hard problems that will require a new set of tools, and even a new Francis Bacon. The good news is that I think we will crack them. We stumbled on this world in the mid-1980s, but IMO didn’t get serious until the mid-2000s.
Agree with everything you say here for whatever it’s worth. We need a new kind of flashlight to see ahead and specifically a flashlight for complex systems.
What about education? Think the day away/teleport thing might break on this one. The tribe writes down descriptions of very distant places in very exacting detail and if a student spends ten years studying it they can get there instantly vs say two hundred years if they tried to go it alone. Or do we define the day as what is possible to achieve even with education?
Other interesting thought is artifice. One day the tribe invents a car. I mean that literally in this analogy, although maybe microscope is better. Or stilts or a shovel or something in this analogy? The mere addition of tools that allow you to reach greater depths or heights causes the depleted land to have new bounty. Some of those technologies exist farther away.
I like this a lot overall. I have a similar analogy about lighthouses that I use.
In some sense, a lot of progress in science can be thought of more as "getting closer to the truth" than as "finding new terrain". "Getting closer to the truth" comes from a change of perspective, and this change of perspective mostly comes from new technology or observations, like the Michelson-Morley experiment, which gave rise to Relativity, or other experiments that led to Quantum Physics. The age of the scientists is generally irrelevant. Physics was many hundreds of years old when Einstein and Dirac, young scientists, made their discoveries. Although they may in themselves be giants, it is difficult to argue that such giants don't exist at all today in terms of sheer intellect and hard work.
Hence, I feel that point no. 5 and confirmation bias can explain a lot of this. People learn a paradigm and try to stick very hard to it, until new technology makes experiments possible that clearly contradict those paradigms, causing the paradigms to change. The first scientists to then discover those changed paradigms that accommodate the new experimental results become heroes.
Let's revisit this issue when you've got more data about the scientists, so that you can concentrate on that instead of elaborating the already-clear forager metaphor and then shrugging your shoulders over the real question.
This is really pessimistic without the last part - that at some point, the foragers manage to set up camp in another part of the forest, acquiring untold riches at first, then letting others set up even further.
This is what happened with machine learning, with biotech (next generation sequencing, anyone?), in fact a lot of science is about this kind of camp-setting. "Standing on the shoulders of giants", and it's giants all the way down/up.
There is a huge difference between having to figure out calculus from first principles, and learning it in high school then moving on to something cooler. And then you can have the computer calculate your integrals for you, with calculus relegated to your "maybe figure it out someday when it's needed" pile. Knowledge is a tool for acquiring further knowledge.
As I say in the original essay on genius, I think it's true that "ideas are getting harder to find" (what you call the "Low-Hanging Fruit Argument"). It's also empirically supported by looking closely at things like agricultural yields. The question is just whether it fully explains the effect, or even most of the effect, and there are reasons to doubt that. For example, the two reasons I give in the original essay to be skeptical are:
(a) if the lack of genius (or let's just say "new ideas") is due solely to ideas getting harder to find, then it is an incredible coincidence that, as the effective population of people who could find such ideas exploded to essentially the entire globe (with the advent of the internet and mass education), ideas got harder to find to the exact same degree. In fact, this looks impossible: the expanded population should first have rapidly "mined" the idea space before exhausting it, which would have triggered a cultural golden age. It is on this question that the original essay starts, but I've never seen anyone address why the explosion in effective population didn't lead to more "finding" - because that doesn't look like what we see.
(b) “ideas are getting harder to find” seems especially unconvincing outside the hard sciences, in domains like music or fiction. I actually still think there is some truth to it - you can only invent the fantasy genre once, and Tolkien gets most of that credit. But overall it seems obviously true that something like fictional stories isn't as directly "mineable" as thermodynamic equations. And yet, again, we see the same decline in both at the same time, so the explanation needs to extend beyond the hard sciences.
How much of the entire globe has supporting infrastructure (physical and cultural) that makes these places a viable location for research? If we moved a researcher from Switzerland to Kenya or Bangladesh, how would it affect their output?
Surely it might, but perhaps less than one might expect - there are some great universities in India! But consider just within the US: it used to be that only the kids at MIT got the MIT lectures. Now *anyone* can get the MIT lectures. It used to be that only the kids at Harvard got access to the best math teachers. Now there's Khan academy. Most scientific papers can be found online, even if you don't have institutional access. There's thriving intellectual communities on blogs and in forums and places to post contributions at zero-cost, if you have them to make. Not to mention the mass education - just look at how many new foragers there should be now that racial and gender barriers have been significantly decreased and college is basically mandatory for a huge swath of Americans! An explosion of foragers, and the ease of foraging massively increased, all within a few decades. And yet, and yet.
Khan and MIT lectures are great fallback resources, or adequate resources to get an understanding somewhere between a high school student and a really bad undergrad.
I tried learning molecular biology as an amateur before actually returning to college for it (perks of the free, if substandard, education in my country). Maybe I could pull it off if I was a supergenius, but realistically I'd have an extremely fragmented and ungrounded understanding. So for my first contribution to the field, I'd need to:
- read papers en masse where the only layman-accessible words are prepositions
- understand the context of those papers to know why they're doing what they're doing, and why it's important for the field, and why couldn't it be done in another easier way
- identify limits of the state of the art
- come up with a good idea for a novel contribution, by myself, without asking people who already grapple with these problems
- find collaborators and convince them my idea is good and I know my shit
- write the paper
- get it accepted in a journal, or at least self-publish on biorxiv and make a considerable number of domain experts read it
God forbid I need actual wet experiments and funding - that's just the list for pure in silico work!
At any point I can make mistakes - I don't have any experts to look over my shoulder and tell me when I made a mistake. I'd spend years of my life working on a project that is completely irrelevant and either someone did it better five years ago, or it's just something that nobody does because there is no point to it.
>- get it accepted in a journal, or at least self-publish on biorxiv and make a considerable number of domain experts read it
Isn't this exactly the problem caused by greatly increased amount of people involved in science? Too many people publishing so much that nobody can read it all and judge it on its merits. Consequently everyone falls back into social games of knowing the right tastemakers and having access to the right connections. Instead of a democratic/capitalist marketplace of ideas, it becomes an aristocratic society reminiscent of an early Victorian novel. ("Aristocratic" intended as a slur, not its literal meaning.)
A century ago a scientist could have a very good grasp of their field by reading a handful of journals relevant to their discipline. Two centuries ago, you'd do well by reading a single journal (Philosophical Transactions).
"An explosion of foragers, and the ease of foraging massively increased, all within a few decades. And yet, and yet."
Okay, I'm gonna bite here. Take one of your geniuses of the past, plonk him down today, and see if he still counts as a genius. Strip away the mythos of Einstein, take his work from youth to maturity, and compare it with people working in the same field today.
Would Mozart be considered a genius? Maybe. Or maybe he would go into composing movie scores which is lucrative and steady work, and nobody would talk about him as a great genius, even if he radically transformed how movie scores are written and used.
The mark of the past genius should be that they could still come up with novel concepts and discoveries even after absorbing all the progress made in the field since their day. But could they? Would Einstein be able to leap ahead with new ideas, or would he have reached his limit at what are now "yes, this is part of the field everyone has to know" ideas?
I do think there are natural human limits. It may be that we are hitting up against them, and that the intellect sufficient to be a revolutionary genius in the low-hanging fruit days is not sufficient to be the same in days of scarcity of fruit. It could well be that in ten years time somebody comes up with something new and unprecedented which is indeed unforaged territory, and the corresponding Great Men will flourish in that field. Giving up on "where are all the geniuses?" right now seems premature to me.
"there should have been "mining" of the idea space in order to quickly exhaust it, and which would have triggered a cultural golden age."
What makes you think this is not a cultural golden age? How would you define a cultural golden age? As we are constantly reminded, no other society in the history of the planet has been as rich, as educated, or lived as good lives as we do. We have everyday devices that were the stuff of science fiction even thirty years ago, and even our poor can access some model of those devices. We have the Internet; we have quicker, easier, cheaper access to more information than any culture before us has ever had. Diseases and health problems that would have been death sentences are now curable with a course of pills or surgery. Ordinary people have access to creative resources that even the great craftsmen and artists of former times could not have dreamed of.
The complaints seem to be along the lines of "where are our colonies on Alpha Centauri?" which, when you think about it, can only be the kind of complaints from a rich, successful, technologically advanced society accustomed to constant progress.
(I'm not saying we are in a cultural golden age, just that golden ages tend to be recognised by looking back to the past and saying 'ah yes, that was a time of wonders'. What will our descendants a hundred years from now think - will they talk about our golden age?)
I don't think most people credit Tolkien with "inventing" the fantasy genre since most everyone knows it existed before him. It's just that nearly everyone since has been writing in his shadow.
So far as I know, Tolkien didn't invent the fantasy genre. He invented, or at least popularized, serious world-building, which has become a dominant part of fantasy.
It depends on what you mean by "the fantasy genre". I believe that fantasy is a human norm for fiction, and if anything was invented, it's the idea that respectable adults shouldn't want fantasy.
And then.... there was Tolkien and Star Wars and the return of the repressed desire for fantasy.
The low-hanging fruit argument seems very probable to me, and I love the metaphor with real fruit in it!
I would like to add a small (and quite optimistic!) additional hypothesis concerning the decrease in the observed frequency of geniuses, this one related to the increase in population and in its level of education.
Suppose we recognize someone as a genius when he or she clearly surpasses everyone else in their field - so the judgment is mainly relative, made by comparison with what other people in the field are producing at a given moment. In that case, the fact that the population, and its level of education, is increasing must also significantly increase the number of people working in any given field. And then, statistically, the probability that the most talented person in a field is much more talented than the second most talented person in the same field is probably much lower than before.
Therefore, we would have difficulty recognizing contemporary geniuses partly because there would be many people doing extraordinary things in general, whereas before there were a few who stood out.
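To make that last statistical claim concrete, here is a minimal simulation sketch, under the purely assumed model that talent in a field is drawn from a normal distribution; the field sizes and trial counts are made up for illustration, not taken from anything in the thread:

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_top_gap(field_size: int, trials: int = 500) -> float:
    """Average gap between the best and second-best of `field_size` normal draws."""
    gaps = []
    for _ in range(trials):
        talents = rng.standard_normal(field_size)
        top_two = np.partition(talents, -2)[-2:]   # the two largest values
        gaps.append(abs(top_two[1] - top_two[0]))
    return float(np.mean(gaps))

# As the number of people working in a field grows, the expected gap between
# the #1 and #2 person shrinks (slowly), making a lone standout less likely.
for n in (50, 500, 5000, 50000):
    print(f"field of {n:>6}: mean gap between #1 and #2 ≈ {mean_top_gap(n):.3f}")
```

Under heavier-tailed talent distributions the picture can differ, so this is only an illustration of the intuition, not a proof.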
I like the rough model, but I'd point out that there are certain topological assumptions being made which maybe don't apply. If 'places of insight' were arranged in some Euclidean geometry, then your theory holds.
But if we generalize it to "finding new knowledge requires _either_ walking new ground, or exceptional talent" (which I think is totally fair), we might ask whether it's possible to walk new ground via nontraditional approaches. If the _only_ dimension we consider is 'angle and distance from the base camp', i.e. the territory is a 2-D Euclidean grid and we've mapped out where everyone has and hasn't walked, then it becomes much less likely you will _find_ new ground immediately around the camp.
But if the number of dimensions is so high that most people don't even _see_ a bunch of dimensions, then we might actually expect _creativity_ to lead to insights more readily than intelligence.
Or, if technology+economics have changed in such a way that someone might have 10 different mini-careers and still acquire sufficient wealth to do as they please, this might _also_ be 'new territory' where discoveries become easy. So we might expect future discoveries to be more likely from, say, a startup employee turned venture capitalist turned amateur horticulturalist turned poet turned botanist, who synthesized a bunch of experiences that many other people had had _individually_, and yet nobody had yet had _collectively_.
The fruit-gathering analogy might work if someone is the first person to circumnavigate the camp at a specific radius, and to spend at least a few weeks at different angles at different times of the year. They might notice some seasonal continuity between plants growing only at that radius, which might only be observable to someone who had spent the right amount of time in all of those places. In terms of ground, they haven't covered anything new. But if we include time in there, then yes, it's like they _did_ walk on new territory.
So i like the theory if we generalize it as "to maximize your chance of discoveries you have to walk on ground nobody else has walked on before", but it's worth asking whether "the space of being an academic researcher" being extremely well-trodden means that there aren't low hanging fruit in dimensions none of us have even considered looking in.
Like, for all we know, just breathing weird for like 7 years straight could let you levitate and walk through walls. How would we know if this were true? Suppose someone discovered it 10,000 years ago, and they did it, and everyone was like 'holy shit that's crazy' and they wrote stories about it, and today we dismiss those because they are obviously absurd. Are _you_ willing to spend seven years chanting some mantra on the off chance that maybe it'll let you walk through walls? I'm not. Probably most reasonable people aren't. That's some unexplored territory right there! But something tells me it probably isn't worth the effort.
And yet people like Wim Hof exist. This tells me there's probably a ton of low-hanging fruit still around, but it'll be discovered by eccentric weirdos.
Is there any reason to assume science fruit space is even Euclidean?
As players of https://zenorogue.itch.io/hyperrogue know, in hyperbolic space you can fit huge worlds so close that you'll get from anywhere to anywhere else in a few steps. The problem is just knowing the way.
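A tiny back-of-the-envelope sketch of why a hyperbolic "fruit space" would behave that way: the number of cells within graph distance d grows roughly exponentially, so a world of N cells is only about log(N) steps across. The branching factor below is an arbitrary assumption for illustration, not the constant of any particular tiling:

```python
import math

# Assumed effective branching factor per step; real hyperbolic tilings have
# their own constants, this just shows the logarithmic scaling of diameter.
b = 2.6

for n_cells in (10**3, 10**6, 10**9):
    diameter = math.log(n_cells, b)   # rough number of steps across the world
    print(f"{n_cells:>13,} cells -> roughly {diameter:.0f} steps across")
```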
I'm not sure about "taking more time to reach the frontiers of knowledge". Bachelor's degrees haven't gotten steadily longer over time, and previous key discoveries get built into the curriculum. The length of postdocs has (particularly for those eyeing an academic career), but that has more to do with the often enormous quantity of work required to get your Nature Something "golden ticket" paper. Once you start grad school you're basically teleported to the frontier. People learn and adapt quickly.
I think genuine breakthroughs happen on a more regular basis than people think, but we've pushed the depths of knowledge so deep that they're not necessarily recognizable to an outside observer.
I’m not sure about other fields, but I can say that it definitely takes another 2-3 years after a BA to hit the frontiers. This might vary depending on your subfield, but in analysis it definitely does.
I really enjoy analogies, so thank you for writing up this very thoughtful and entertaining model. I think there's another thing at play, which is distraction. I'm not as talented a writer as you, so instead of clumsily trying to extend the analogy, I'll tell some stories about my own medical school class.
I went to a well-regarded medical school with lots of brilliant and talented classmates. I will say there was a big difference in how excited each of my classmates was by discovery, and interest in discovery was largely orthogonal to pure intellectual horsepower. Some of the smartest people I've ever met had exactly zero interest in research--they jumped through the appropriate hoops to get residencies and fellowships in lucrative fields and now enjoy lives where they are affluent, enjoy high social status, and do work that they find interesting enough. I think some of these folks are "potential geniuses" who made a rational choice to take a sure thing (a career as an orthopedic surgeon) over something more volatile (a career doing research in the life sciences).
To give an example of the same effect, working slightly differently, a friend of mine told me that he had taken a job as an investment banker right after college, and then was laid off before he could start working due to the financial crisis. He came to medical school as a backup plan, and is now an extremely talented epidemiologist.
The final story is about a friend who, while he was a post-doc (MD-PhD), realized it made much more sense to moonlight as a doctor and pay other post-docs (who were PhDs and didn't have the more lucrative option of taking care of patients) to execute his experiments for him. This was kind of a boot-strappy way of leveraging the resources around him. But I tell this story because he had to make science more of a passion project funded by his actual lucrative career as a physician.
I take three things away from these stories:
1. It doesn't really make a lot of sense to study the sciences (especially at a fundamental, basic level that is most likely to create groundbreaking discoveries) if what you care most about is a comfortable or happy life. True, the rewards are enormous for the right-most outliers, but most people work very hard for tiny material rewards, when they're usually clever enough that they could have nicer lives in most other careers.
2. Having a successful career as a scientist is HIGHLY path-dependent. You have to have the right sequence of experiences that give you more and more momentum, while also not having experiences that pull you off your scientist path onto more lucrative or comfortable paths. This is a point that's been made MANY times before, but I wonder how many potentially great thinkers over the last 30 years have pursued careers in management consulting, banking, dermatology, or orthopedic surgery. Obviously these people still make potentially great contributions to society in the roles that they take, but they are much less likely to expand the frontiers of human knowledge.
3. We still probably undervalue most research, as a society. Because the potential payoff is so uncertain, individuals have to bear a lot of the risk of these careers. There's an enormous opportunity cost to getting a PhD and doing a post-doc, and even if you are one of the few successes who gets your own lab, it's still not a very materially rewarding situation. So what you end up with is a) a lot of talented people who bail on their science careers for things that are more of a sure thing and b) a lot of people who never consider a science career because it represents a high-risk, low-reward scenario compared with the other options in front of them.
For the sake of argument, let's grant that your argument as presented is 100% correct. Even so, outsized focus on the political aspect is right and proper because unlike the mechanical causes we have some small hope of changing the politics. Instead of "there's no mystery to explain here" the takeaway could be "we need to run a tighter ship of Science, the deck's stacked against us".
Perhaps a more apt analogy for science is not picking fruit, but planting fruit trees. Planting a fruit tree suggests a scarce return in the short term, but the returns can expand organically in two ways: as the tree grows, and as the seeds from the tree spread to sprout other trees. So a single planted tree has the potential to spawn an entire ecosystem. Similarly, knowledge begets knowledge.
"machine learning should have a lower age of great discoveries."
Possibly controversial opinion but machine learning is a technological field, and not a scientific one... or rather -- none of the science is novel. The advances in machine learning are a combination of the scale afforded by modern hardware, vast amounts of data, and statistical and curve-fitting theories that have been around forever. The big issue with regarding it as a scientific field (for me) is that they aren't coming up with new principles as such, they're coming up with a set of techniques to accomplish tasks. And in general they have no idea how these techniques actually accomplish these tasks -- the loop is generally suck-it-and-see; hence all the pseudoscience surrounding it and baseless claims that brains work like neural nets, or that sexual-reproduction works like drop-out, and so on.
Another factor is that to make a discovery in machine learning, you need to spend a lot of money on compute, and a lot of money on data (or have an agreement with some company that already has tonnes of it) -- so this also favours established people.
Finally, advances in machine learning are consistently overstated. GPT-3 already absorbs more content than any human has ever absorbed; and people are amazed that it can muddle through tasks that are simple for a human child with a fraction of the compute, or training data. Also, there's a bit of Emperor's Clothes about this stuff. One of the useful things about human cognition is that you can tell a human "hey, there's this interesting thing X" and the human can quickly assimilate that into their model and use it. For example, I can give you a slightly better method for multiplying numbers, and you can apply it pretty instantly. This is what "learning" usually means for human cognition. You can't explain to GPT-3 a better method of multiplying numbers. And there's no mechanisms on the drawing board for how to do it. Sorry this is a bit of a rant, but in my real life I'm surrounded by people who think GPT-3 is basically a human brain and it drives me nuts.
I think you need different models for science and technology. As you say, physics seems to have stagnated, but our technology continues to advance; cosmology continues to advance, but space-faring technology regresses; scientific knowledge about birth advances, but birth outcomes in terms of morbidity and cost decline. The same for software engineering: the science continues to advance, but the technology declines (see Collapse of Civilization by Jonathan Blow).
I'm pondering whether this is related to scientific discoveries too. Since geography varies, and science fields vary also, I think there's some merit here. One forager may specialize in muddy seeps, whilst another focuses on the banks of larger rivers, and another robs the nests of cliff-dwelling birds. Each would find different resources: one comes back with a fish, another with cattail roots and duck eggs, another with swallow eggs and nestlings. Likewise in science, someone plays at melting things in the furnace, someone plays with light and lenses, another ponders infinite series.
"Some writers attribute the decline in amateur scientists to an increasingly credentialist establishment"
I suspect that one reason for the credentialist establishment is that it takes many years to reach the state of the art in knowledge, and non-rich people can't afford to spend that many years studying rather than working. The longer it takes to reach state of the art, the more money has to be spent getting that student to the state of the art, and the greater the need for a bureaucracy to decide who gets it and who doesn't - and bureaucracies run off credentials.
One reason I think that the UK is overrepresented in scientific research is that our education specialises earlier than most other countries, which means that, at the expense of a broader education, Brits can reach the state of the art several years earlier than Americans (the average age at PhD is 29 vs 33).
If true, this is an excellent argument for letting gifted kids specialize earlier, while the whole educational community is pushing for a longer period of general education. There are other considerations here - maybe children with a more general education are more likely to lead happy lives and it's worth sacrificing a few potential geniuses to the gods of mediocrity to make that happen.
But if so that just brings us back to Hoel and the idea that an education that is personalized and one-on-one is just vastly superior to our system at cranking out revolutionary thinkers.
I guess if you find yourself burdened with precocious progeny, the strategy is get 'em young, find someone who can cultivate their strengths, and try to keep the truancy officer away long enough that they aren't forced to spend 6 hours a day proving they're reading books they already read.
Mostly off-topic but fun and kind of instructive game: Imagine what a modern education looks like for geniuses of the past. What was Oscar Wilde's mandatory elective? What does a paper about the Themes of the Scarlet Letter with an introduction paragraph, at least three body paragraphs, and a conclusion paragraph look like if written by Newton or Einstein?
Classics, if this Wikipedia article is correct (I read some anecdote years ago about one of his Trinity tutors saying, about Wilde's success in England, "Yes, it was better for Oscar to go there, he wasn't quite up to the mark here"):
"Until he was nine, Wilde was educated at home, where a French nursemaid and a German governess taught him their languages. He joined his brother Willie at Portora Royal School in Enniskillen, County Fermanagh, which he attended from 1864 to 1871....He excelled academically, particularly in the subject of Classics, in which he ranked fourth in the school in 1869. His aptitude for giving oral translations of Greek and Latin texts won him multiple prizes, including the Carpenter Prize for Greek Testament. He was one of only three students at Portora to win a Royal School scholarship to Trinity in 1871.
Wilde left Portora with a royal scholarship to read classics at Trinity College Dublin, from 1871 to 1874, sharing rooms with his older brother Willie Wilde. Trinity, one of the leading classical schools, placed him with scholars such as R. Y. Tyrell, Arthur Palmer, Edward Dowden and his tutor, Professor J. P. Mahaffy, who inspired his interest in Greek literature.
...At Trinity, Wilde established himself as an outstanding student: he came first in his class in his first year, won a scholarship by competitive examination in his second and, in his finals, won the Berkeley Gold Medal in Greek, the University's highest academic award. He was encouraged to compete for a demyship (a half-scholarship worth £95 (£9,000 today) per year) to Magdalen College, Oxford – which he won easily.
At Magdalen, he read Greats from 1874 to 1878, and from there he applied to join the Oxford Union, but failed to be elected.
While at Magdalen College, Wilde became particularly well known for his role in the aesthetic and decadent movements. He wore his hair long, openly scorned "manly" sports though he occasionally boxed, and he decorated his rooms with peacock feathers, lilies, sunflowers, blue china and other objets d'art. ...Wilde was once physically attacked by a group of four fellow students, and dealt with them single-handedly, surprising critics.
...In November 1878, he graduated with a double first in his B.A. of Classical Moderations and Literae Humaniores (Greats). Wilde wrote to a friend, "The dons are 'astonied' beyond words – the Bad Boy doing so well in the end!"
Classics are a good choice for a writer/artist - My high school offered band, orchestra, drama, show choir, and home ec. I was not an exceptional student and it wouldn't have occurred to me to ask for a classics elective, but if some incredibly talented young person had, I suspect the folks there would have needed to look it up.
"the strategy is get 'em young, find someone who can cultivate their strengths"
How young are we talking, and how specialised? Suppose little Johnny is good at maths, so you identify that as where he has the potential to excel. So you steer him along a path leading more and more to specialisation in maths, and prune away any extraneous subjects. And sure, he ends up excelling in maths - but there's an amazing breakthrough in biology he could have made, except all that was pruned away early in favour of keeping him on the maths track.
The Polgar sisters are chess prodigies, but could they have been doctors, musicians, engineers? We don't know and are unlikely to ever know, because while I don't think their father isolated them with chess alone, that was where the positive reinforcement came in. Doing well at something else was praised, but doing well in chess was where the most attention and most celebration and most reinforcement happened.
What way would they have turned out with a more general education? Would they have been prodigies in a different area, if left to natural inclinations? That's not a question we can answer, but I do think it needs to be asked when we're talking about steering kids to specialise in one topic over another.
Entirely agree - I'm currently expecting and know I am not the sort of person who will be able to steer my own child into doing one thing their whole lives without giving them input. I *do* think this means that I will not raise a "person who gets to be in the history books" level genius, but most "person who gets to be in the history books" level geniuses I read about turn out to have miserable personal lives and feel bad about themselves forever. And an unusual number of them turn out to have serious issues with their fathers so...
> I can’t immediately figure out how to calculate it, so let’s just assume some foragers aren’t rational.
I can't resist that remark. Nerd snipe successful. First, the short answer: after 9 hours of travel you only have half as much time to forage as after 6 hours, so you need to gain twice as many points per unit of time to compensate. So if the area at 6 hours' distance is 50% depleted, the area at 9 hours' walking distance should be 100% virgin to yield the same expected total value. Only after the depletion level in all areas within 9 hours' walking distance increases does traveling further become worthwhile.
Compare with the early explorers: Nobody will travel when the area at distance 0 has full value; it is only after the depletion level at close distance starts to become noticeable that people will decide to venture out (and even then, they'll travel as little as possible if they want to maximize their gain).
A more general computation: assume we are in a state of equilibrium, and let D(x) be the depletion level at x hours from camp. After walking x hours and gathering for 12 - x hours, you gain (12 - x) * 100 * (1 - D(x)) points. In equilibrium this should be constant, so (12 - x) * (1 - D(x)) equals some constant, say C. Then 1 - D(x) = C/(12 - x), i.e. D(x) = 1 - C/(12 - x).
Given the assumption that D(6) = 0.5 (you need to make an assumption somewhere), you find C = 3 and hence 1 - D(9) = 1. When traveling further, D(x) becomes negative, i.e. the area would need an expected value of more than 100 points per hour to be worth traveling to. In a model where D(x) must stay between 0 and 1, the depletion level gradually decreases as you travel further until you reach D(x) = 0, at which point exploring further gains you nothing. Note: D(x) = 0 when x = 12 - C.
You can measure C by checking the depletion level at any point where people do forage. For example, if the area at 0 hours' distance is 95% depleted, then 1 - 0.95 = C/12, so C = 0.6, and people will travel as far as 12 - C = 11.4 hours to forage for 0.6 hours; 0.6*100 = 5*12 points. Chances are that far before this point it'll become valuable to invest in ways to travel further.
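For anyone who wants to poke at this, here is a small numeric sketch of the same equilibrium model. The 12-hour day, the 100 points per hour for virgin terrain, and the D(6) = 0.5 and 95%-depleted-camp examples come from the comment above; the code just restates its algebra:

```python
def depletion_at(x: float, C: float, day: float = 12.0) -> float:
    """Equilibrium depletion D(x) = 1 - C/(day - x) at x hours from camp,
    clamped to [0, 1] since terrain can't be less than 0% depleted."""
    return max(0.0, min(1.0, 1.0 - C / (day - x)))

def daily_points(x: float, C: float, day: float = 12.0, rate: float = 100.0) -> float:
    """Points gained by walking x hours, then gathering for the rest of the day."""
    return (day - x) * rate * (1.0 - depletion_at(x, C, day))

# First example: D(6) = 0.5 fixes C = (12 - 6) * (1 - 0.5) = 3, which makes
# D(9) = 1 - 3/(12 - 9) = 0, i.e. virgin terrain at 9 hours out.
C = (12 - 6) * (1 - 0.5)
for x in (0, 3, 6, 9):
    print(f"x = {x}h: depletion {depletion_at(x, C):.2f}, daily points {daily_points(x, C):.0f}")

# Second example: if camp itself (x = 0) is 95% depleted, C = 12 * 0.05 = 0.6,
# so foraging stays worthwhile out to 12 - C = 11.4 hours of walking.
C2 = 12 * (1 - 0.95)
print(f"C = {C2:.2f}, break-even travel distance = {12 - C2:.1f} hours")
```

The first loop prints a constant 300 points per day at every distance, which is exactly the equilibrium condition the comment describes.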
Wouldn't the foragers just move camp? If it takes nine hours to get from present campsite to new virgin area, isn't it simpler to pack up and move the entire camp closer, rather than spend the majority of time travelling to and from the new site, with less time to forage?
Agriculture would be different, as you are rather tied to one area (you can't just pack up an entire farm). Even there, it's easier to drive out your animals to graze in a good area and then drive them back in the evening to be milked, and if it starts to take too much time to travel to and from the pastures, you do things like let the sheep roam free on the mountain and only bring them down at certain times of year (a bit tougher with cattle, but transhumance is definitely something: https://en.wikipedia.org/wiki/Transhumance).
I'm not entirely sure where I'm going with this, mostly that the analogy breaks down for me here. If your foraging grounds are further and further away, you can easily pack up and move closer. You can't quite do the same with fields of knowledge, because you do have to traverse the already-covered ground to give you the background you need before you can get to the new, untapped area.
Oh, absolutely, in my last line I was mostly sticking to the existing analogue that I think Scott intended, in which I interpret the twelve hours roughly as the human lifespan, traveling as time spent learning the basics to be able to understand an unknown field of knowledge, and foraging as actually doing research in said field. In that case 'moving the camp' is (probably) not an option (we all start from birth), but finding ways to travel further might be possible (be it improved education (faster walking speed and/or more efficient foraging), intelligence enhancement (same), lifespan extension (more hours to travel), or [insert favorite way to increase the amount you can learn]).
Sticking to the actual foraging story, I agree that moving camp would in most cases be more sensible than investing in maximum travel distance (especially if you actually manage to reach 95% depletion of the nearby area).
> you can easily pack up and move closer. You can't quite do the same with fields of knowledge, because you do have to traverse the already-covered ground to give you the background you need
I don't think this adequately explains why people who made great discoveries when they were young in 1900 didn't increase their rate of making great discoveries when they were old and bring the average up. One needs to explain 1900-people losing discover-ability as they age, but 2000-people gaining it.
One unmentioned thing can help explain this: the extension of healthspan. The mind is the brain, and the brain is just an organ in the body; if the body is generally dysfunctional, the brain will probably not be in its best condition either. Being in great health instead of poor health probably at least dectuples the probability of making some great discovery. The age-related cognitive decline curve has probably shifted a lot due to the extension of healthspan.
I think there's something foundationally missing from this model. Very specifically - what about cranks and weirdos who were retroactively not cranks and weirdos?
More specifically - all of the computer science greats (Dijkstra, Turing, etc) all did their foundational *mathematical* work well well before they were household names (at least among people who are relatively intelligent and science-aware).
There's a great revolution that happened around 1980 that suddenly made computer programming, computer software, and thus *computer science* and all of its offshoots - massively more high-status and important because the Fast Transistor Microprocessor was starting to allow more and more things to use Software.
Without the Fast Transistor Microprocessor, none of that work would be lauded as the genius work it is (Turing, rather famously, was criminally prosecuted rather than celebrated) and would instead be an esoteric curiosity for mathematicians.
I get the feeling that with the amount of Science Infrastructure we have in place today, absent some New Technology or Paradigm that enables a lot of work that was done previously to matter in a new way, or enables new work - most people seeking truth are going to be happily chipping away in the truth mines for proofs or evidence for esoteric effects that aren't super relevant to today. We will lament their lack of progress in world changing and truth seeking for decades.
Suddenly - something will change, some new technology will get invented, or some new mathematical, computational, or scientific tool will become widely known or available, and suddenly proofs from 1960s mathematicians or earlier are no longer esoteric curiosities - they're the world-shaking foundation of the modern universe.
I keep thinking about the time period of the Buddha, Jesus, and Mohammed. (I know that's quite a range of time, but in the course of human history, it's not so much.) Was there just a sweet spot around then for religions? Like, there was enough 'ambient philosophy' around that new and compelling religious discoveries could be made? (Although it's not what I actually believe, for this purpose assume by "discovering" I mean to say, there are certain religious ideas that can be dreamt up that enough people will find compelling that they can gain a real foothold. Discovering is finding one of those ideas.)
Arnold Toynbee thought a lot about this in his “study of history.” He postulated that as civilizations atrophy and then decay, the period of struggle during the long collapse often gives rise to a spiritual upheaval leading to a more enlightened religion. Speaking in broad strokes, Toynbee sees 3 major cycles of civilization (we are in the third), with each time of struggle spawning a more advanced spiritual state (e.g. Baal worship - Judaism - Christianity, with parallels in Asia and India leading to Hinduism and higher Buddhism). This new religion then goes on to define the successor civilization in Toynbee’s model. The time period you are talking about would be for him the struggle period of the second cycle of civilization.
Of course Toynbee is all but cancelled and forgotten these days for his taking religion and spirituality seriously in his historical analysis, and for his sweeping narrative approach which went afoul of postmodern historiography.
If he is right, and if we are in the struggle phase of Western Civilization (debatable), the question is what new spiritual system will arise from the ashes of the West. I feel like that is at least related to your question.
I have a general conviction that hard polytheism is outcompeted by the alternatives because it doesn't hold up to scrutiny or really offer meaning or answer any important existential questions.
But in your comparison, I'd probably nix Mohammed and think instead about Zoroaster. Historians don't agree when he lived -- close to the reign of Cyrus the Great or 1,000 years before? But if we speculate it's the former, then a certain "Age of the Prophet" can start to be seen, centered on Persia and the lifetime of Cyrus, and probably ending with Mani. Cyrus ended the Jews' Babylonian Exile and began the Persian conquest of modern-day Pakistan, on Buddha's doorstep and near-contemporaneous with Buddha's life. He also came towards the end of the age of the Old Testament prophet, though a handful were post-Exile.
Now, as a Christian I'd argue that the prophet of that age was a sort of God-given Jewish "technology" that spread to Persia and finally India, but non-Christians will generally argue the reverse and that Second Temple Judaism imitated Zoroastrianism. I think this is mostly a faith judgement either way.
Steven Johnson has a similar concept he explains in his book, Where Good Ideas Come From: The Natural History of Innovation, called "the adjacent possible." His analogy is that every new discovery opens a door into a new room which contains yet more doors. Each new discovery opens paths to new discoveries.
I have been a bit confused by the premise of this conversation on genius and the perceived implications (concern?) that it seems to be bringing up.
My (oversimplified?) understanding of Hoel's original piece:
1. The world used to produce "geniuses" (towering giants in a single field, or multi-disciplinarians who made large contributions across many fields). Some of them even made their contributions in their spare time!
2. We don't do this any more
3. This is bad/concerning
4. How can we solve this problem?
5. Aristocratic tutoring?
Isn't this essentially specialization playing out? The reason this doesn't happen anymore is that the comparative advantage, even for people with the same natural talent as past geniuses, is more than overcome by the specialization now required to make a contribution in nearly all fields. Instead of being a problem, isn't this a natural consequence of all the efforts of those who came before? As Scott's analogy points out, hasn't all the low-hanging fruit been picked?
That strikes me as a much simpler answer than a lack of aristocratic tutoring.
Interesting article. I think one element this fails to take into account is the general category of surprise/accidental discoveries. Like Kuhn's paradigm of scientific revolutions on a small scale.
To put that in terms of your example: what if one day little Jimmy the forager trips and lands face-first on a rock and realizes it's edible? It doesn't matter then whether he is experienced, smart, old or young.
I think the conceit that knowledge is dimensional is flawed in a number of ways, not least the ways others have already brought up, such as that historical ideas make entirely new ideas possible.
I'll observe that somebody (Cantor) invented set theory. He didn't find a new space in the territory - he created new territory out of nothing.
Sounds solid to me. But I'll nitpick against the claim that "physics is stagnant." This is arguably true for high energy physics, but physics as a whole remains vibrant, largely by virtue of constantly inventing new subfields (which open up new foraging opportunities). See my DSL effortpost on the topic here: https://www.datasecretslox.com/index.php/topic,3007.msg91383.html#msg91383
Per the typology I propose in that effortpost, you can divide physics into six major subfields, of which only one can really be argued to be stagnant. That subfield accounts for only ~10% of professional physicists (according to statistics from the American Physical Society), although it might be more like 99% of "physicists who talk to journalists."
I have enjoyed Scott's whole collection of posts around research productivity. I want to throw in another ingredient that I think should get more attention.
In most fields, having a research-focused career has gotten dramatically more competitive over the last generation or two. Intense competition can help motivate people to work harder and reach further, but it can also stifle creativity. I'm specifically thinking here about the need to publish and get grants, and how in highly competitive areas it's easy to shoot down an application or manuscript due to some weakness, even if there's something really interesting in it. It's super-extra-hard to come up with brilliant new vistas to explore when you simultaneously have to defend against a horde of maybe-irrelevant criticisms.
If this dynamic is important (not sure if it is), the only way I see to address it is to somehow collectively limit the number of people who have research careers.
Maybe amateur scientists are less common because our true leisure class is smaller? Even the children of oligarchs like Trump's kids pretend to flit around doing some kind of Succession thing, whereas in the past it was totally normal to own enough land to support your lifestyle and then go off on a hobby forever.
And, of course, Darwin had several centuries of naturalistic observations and descriptions of life forms and life ways to build on. Without those, he'd have had little on which to base his grand theory. The theory is built on centuries of descriptive work.
I tend to respect my doctoral advisor, and that means I tend to respect the economists he respects, including his doctoral advisor and (presumably) the economists his doctoral advisor respected, etc.
What if "geniuses" are just the Genghis Khans of science?
One small thing I'd still push on, though, is that, in my understanding, scientific progress per se hasn't necessarily slowed down: the _number_ of scientific advancements per year has, in fact, increased rapidly. Take Moore's Law: a constant relative rate of growth in transistors per chip means that the absolute number of transistors added to each chip per year is increasing dramatically. Likewise for crop yields, and likewise for the number of patents and research publications per year.
The issue is twofold: first that the rate of increase has slowed down across many scientific domains, and second that the number of researchers employed to produce that rate of increase has increased dramatically (so that, as you say, discoveries per scientist have fallen).
My point is just that the way we'd expect expanding the talent pool in science to help is by letting us add more researchers, not by making each researcher dramatically more productive. If anything, we'd expect opening up research to more people to make the average researcher less productive, given diminishing marginal returns to research effort. So: expanding science to include women and minorities could very well be a dominant factor in our ability to maintain scientific progress over the past half-century, without it having done anything to reduce the difficulty of creating scientific advances.
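A toy calculation with entirely made-up numbers (nothing here comes from the thread) shows how those statements can all hold at once: the relative growth rate slows, the absolute number of advances per year still rises, and output per researcher falls as the workforce expands:

```python
# Toy numbers only: illustrate how "more advances per year" can coexist with
# "a slower rate of increase" and "fewer discoveries per researcher".
transistors = 1_000          # hypothetical starting transistor count per chip
researchers = 100            # hypothetical starting research workforce
growth_rate = 0.40           # relative growth in transistors per year

for year in range(1, 6):
    added = transistors * growth_rate          # absolute transistors added this year
    per_researcher = added / researchers       # crude "discoveries per researcher"
    print(f"year {year}: +{added:,.0f} transistors "
          f"({growth_rate:.0%} growth), {per_researcher:.1f} per researcher")
    transistors += added
    growth_rate *= 0.95                        # the relative rate slowly declines...
    researchers *= 2                           # ...while the workforce grows quickly
```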
"If you have evidence that the rate of scientific progress is increasing"
What is the evidence that the rate of scientific progress is NOT increasing?
I see no evidence that scientific progress is decreasing.
The number of geniuses increases with population. 0.1% of the population are geniuses. That has always been the case.
It takes more money ... compared to what? There is more wealth.
The "big men of science" framing is a kind of hagiography.
https://www.gwern.net/docs/economics/2019-cowen.pdf
There you go. Lots of discussion about the rate of scientific progress decreasing over a variety of metrics.
Also this discusses the issue: https://www.researchgate.net/publication/347395640_Title_Scientific_Progress_and_Rate_of_Progress
Also also: This subject is talked about a fair bit on this blog.
I don't find either of these at all convincing.
I will not claim that regression is not possible.
And if the claim were grounded in something like the process of growth that Geoffrey B. West writes about, I might take it more seriously.
And if someone suggested a coefficient like "what we know" divided by "what we know we don't know", I would certainly entertain the notion that we (humanity) have been in a constant state of getting dumber.
But if we divide human history into let's say 35 chunks - maybe ~5000 year periods, I would say there is no evidence that the rate of scientific progress could in any way be said to be decreasing.
The number of journal articles has increased exponentially.
I believe the number of significant discoveries has declined dramatically since about 1970. Certainly the life of the average American has changed less in the past 50 years than in any other 50-year period since America was "discovered".
What? Fifty years ago there was no personal computing, internet, sequencing of the human genome.
The modern world was roughly in place in the 1920s-1930s; everything after electrification* has been a footnote.
*Except antibiotics.
And vaccines?
Understanding DNA?
Nuclear weapons?
Computing?
The internet?
Birth control?
Agricultural improvements?
All footnotes?
No; women and ethnic minorities entering the sciences doesn't magically create more jobs in the sciences.
But also, the increase in the number of scientists over the past century is already much, much greater than a mere doubling.
Hunter vs gatherer.
https://prezi.com/nppxy7sixjbu/hunter-gatherers-and-gender-roles/#:~:text=Gender%20roles%20were%20specifically%20defined,were%20performed%20close%20to%20home.
Mm, I was under the impression that the idea that most hunter-gatherer societies were egalitarian is pretty discredited, contrary to what's implied in the presentation.
No idea, but I would assume that hunter gatherer societies would find it harder to develop the large economic surpluses necessary to support a hierarchical society.
This is not the case. Some like the Pacific Northwest Native Americans had substantial surpluses and conspicuous consumption, but anyway..
How were their societies arranged? Did they have a priest class? Dedicated warriors? Clearly defined leaders?
I was delighted to find out that the opening weekend of deer hunting season in Michigan is an especially prosperous weekend on the Magnificent Mile in Chicago because it has become traditional for the wives of deer hunters to flock to the stores to do some expensive gathering.
What delighted you about it?
Middle class American hunters and gatherers.
Rather than political effects or the mechanical effects described herein, I think there are also effects relating to ideology: amateur scientists don’t exist as much because people don’t believe amateur thinkers can provide value anymore, and so they don’t even try. This is distinct from the aspect of only believing credentialed figures when told things, though it is related.
There are strong arguments to be made that a number of scientific fields are wrongheaded in some fashion. In the 1980s, doctors were telling people to avoid SIDS by having babies sleep on their tummies, and now they insist quite strongly the *exact opposite is true.* Numerous “paradoxes” around infinities seem to indicate, at least to some, that maybe we are on a false assumption or two there. Professional physicists have failed to reconcile GR and QM for decades.
The mechanistic model here doesn’t address the “revolution” problem of science: where some philosophical or other assumption is overturned by some “brilliant” new idea (which can often be more common among amateurs than professionals - Einstein being a patent clerk is a good example).
It's not enough that current fields are wrong in some way, it has to be a way that an amateur has some real chance at correcting. I don't know anything about SIDS, but I expect it would take a lot of work for an amateur to even properly understand the open questions when it comes to infinities or quantum gravity, having occasionally tried to understand these myself, and that without this work there is no chance of them making a contribution.
What paradoxes do you think infinities have that modern mathematics fails to resolve?
Whilst I'm not qualified to write of hubris in the medical field, there are a lot of 'boots on the ground' folk who make a lot of good observations, but those folk are not Dr. Fauci level, thus their observations get blown off.
For instance, I work with a guy (non-medical) who ran a COVID testing clinic for a while. Today, he's a true believer in masks—I think he was luke-warm before. But he says "I checked in 91 positive cases in one day, and didn't get sick."
I think it's going to depend a lot on the field. All fields will have some degree of politics to them, because humans are human, but in some fields approximately *all* the barriers are political (eg. many humanities), whilst in others, the knowledge barriers are extremely steep (eg. maths, theoretical physics), or the financial ones are (many experimental sciences, but especially medicine).
Personally, I studied experimental particle physics - while knowledge is somewhat of a barrier there, the biggest obstacle by far is that getting any useful data requires massive machines that take hundreds or thousands of people to build and maintain and cost potentially billions of dollars (I think the LHC is more expensive than most experiments in the field, but the others aren't cheap and they're complementary - you can't do what the LHC is doing without spending billions.) Theoretical particle physics, by contrast, needs little more than a good computer, but the maths is brain-meltingly difficult and all the obvious ideas have been published decades ago.
> amateur scientists don’t exist as much because people don’t believe amateur thinkers can provide value anymore, and so they don’t even try
There are people like this, the trouble is that they tend to be cranks. Cranks are characterised by an excessively strong belief in their ability, as amateurs, to make great scientific discoveries.
Is there a sweet spot between academic insider and kooky crank where you can still make important discoveries? Well maybe, but in most fields you're going to have to spend a lot of time catching up with the current state of the art before you can even find a problem worth thinking about.
I think another cultural piece here is that trying to contribute is now seen as something only cranks do. Sober/serious people just focus on their businesses or what have you rather than trying to email Sean Carroll.
Yes, but those very amateur discoverers were probably seen as cranks by their peers. What do you think of some dude who spends his time polishing little glass beads and trying to look through them? That's van Leeuwenhoek building his microscopes ... but to his neighbors, he's some crank who polishes little glass beads.
And the same person can simultaneously be a revolutionary legend in one field and a crank in several others: https://en.wikipedia.org/wiki/Isaac_Newton%27s_occult_studies
Or more recently: https://en.wikipedia.org/wiki/Linus_Pauling
Regarding your SIDS example, I'm no scientist, but I was a defense lawyer long enough to see that medicine is science plus something that isn't science.
Claiming certainty about the human body even in a particular case is usually stating too much.
Certainty about general health advice is so far removed from science it's better seen as akin to trends in fashion.
That just means medicine isn’t fully accurate yet. The same is true in all sciences, it’s just that physics has come a lot further than medicine.
It is really difficult for an amateur to understand state of the art in a specialized discipline, let alone improve on it. You need instruction from people who'll correct your misunderstandings and point out helpful literature, at least, and at that point you're in a PhD program.
Perhaps the premodern equivalent of a PhD program was just chilling with your philosopher buddy, but that is no longer enough.
> amateur scientists don’t exist as much because people don’t believe amateur thinkers can provide value anymore
I think this is true. I also think that this belief is almost certainly correct.
I have an academic position in a world-leading astrophysics department. As such, I receive a LOT of unsolicited theories from amateur thinkers, almost all of whom believe that they have produced new insights which will push physics forward. It's always very high-concept stuff, like using quantum theory to do away with Dark Matter, or combining electrodynamics with relativity to disprove the Big Bang. Etc.
I do tend to read them fairly carefully, and without fail these authors have a very poor understanding of the topic at hand. The best of them make silly mistakes which would be obvious to an undergraduate; the worst of them seem little more than schizophrenic ramblings.
I think we're past a point where amateur researchers can usefully contribute. If you want to do cutting edge science, you need a more experienced scientist to show you the layout of the field, recommend relevant literature, and (importantly) correct your wrong ideas. I was in a top PhD program, and almost all of my cohort had the experience of coming up with what felt like a smart new idea, and being told 'actually Smith et al. tried this back in 1978, and it doesn't work because [x/y/z]'. Without this guidance, even smart people are going to be hopelessly lost. And once you get this guidance, you're no longer an amateur.
Just to chime in, I think the same is true in biology. A lot of resources needed even to answer relatively simple questions. To answer hard questions, you need an extremely strong foundation based upon mentoring and a lot of study that an amateur would have a tough time achieving.
I think this is probably more true the harder/more scientific a field gets, but it certainly doesn't hold for everything. A friend of mine is a mechanical engineer, and has published some linguistics work that has legitimately pushed forward the frontiers of knowledge regarding the language in question, even in the estimation of professionals studying the same (and related) languages.
Linguistics may not be astrophysics or biology, but there's still rigor to it, and it's still possible to be obviously wrong in a way that isn't true of, say, philosophy.
Sliding more toward the philosophy end of the scale, one of my favorite pieces of history writing (Six Frigates) was done by an amateur, and it's well regarded by scholars, too.
Both speak to the accuracy of the foraging metaphor in different ways, I think.
I suspect that what you really want is someone who is outside this discipline (so they don’t share all the same starting assumptions as everyone else) but has been trained in another discipline (so that they have the discipline of thinking rigorously).
I agree with this, but note that some disciplines still just have insane amounts of existing knowledge to absorb first. I think people with the properties that you describe are far more likely to come up with genuinely good ideas than most amateurs, but will still meet the rebuttal "actually Smith et al. tried this back in 1978, and it doesn't work because [x/y/z]" fairly often. Coming up with good ideas is only one part of the problem, they also have to be novel.
I think there may be more wiggle room in the humanities for this; you can be an amateur sitting at your desk going in minute detail over old publications and chasing trails along new lines of thought as a secondary interest to your main job. There is still room for a Michael Ventris in these fields:
https://en.wikipedia.org/wiki/Michael_Ventris
Once scientific advance has gone beyond "work it out with a pencil and paper", you really can't do that on an amateur basis; as noted above by several, you need the labs for the practical work and the advisers to steer you away from dead-ends.
If you’re going to be the amateur genius who deciphers Linear B, it helps to have an Alice Kober do 20 years of heavy lifting on the project before you get started.
Kober was a Columbia-educated professor of Classics who spent nights and weekends for decades doing the kinds of frequency studies you could now do in seconds with a computer. She made 180,000 index cards.
Kober collaborated with other specialists, but didn’t publish about her work on Linear B until she’d been working on it for 15 years. She won a Guggenheim Fellowship to devote herself to the problem full time for a year. And then, perhaps on the brink of cracking Linear B, she died. Michael Ventris and his collaborators inherited all the resources she developed, which they acknowledged.
It’s not like Kober wasn’t recognized, but Ventris got the lion’s share of the fame while it could be argued that Kober, who was no amateur, did the lion’s share of the work.
Are you willing to mention the specific linguistics work?
Sure—he's been doing comparative linguistics between documented varieties of the Nivkh language (or Gilyak, in some sources; it's nearly extinct but was historically spoken in the Amur basin and on Sakhalin Island).
"Application of the comparative method to vocoid sequences in Nivkh" should get you going in the right direction, if you want to know more. Most (all?) of his Nivkh-related output is in Kansas Working Papers In Linguistics, but he did get a paper on Quechua into IJAL.
Einstein might have been a patent clerk, but he already had a university education in physics and was also working on a PhD dissertation while employed there; Einstein is a good example of a young person creating a revolution, but not a strong example against credentialism or anything. (I'm getting this info from https://en.wikipedia.org/wiki/Albert_Einstein#Patent_office). Also, his patent clerk job still actively involved his science background.
If he was merely working on his PhD, then he didn't yet have the credential.
Sort of. "Physics grad student" is a weaker credential for competence in physics than "physics PhD", but it's a stronger credential than "random patent clerk", and physics grad students are taken seriously as producers of physics research in a way that patent clerks with no particular formal background in physics are not.
For example, I'm pretty sure that today, reputable physics journals publish a lot more papers written by physics grad students than papers by clever laymen.
Someone "working on a PhD dissertation" has already qualified for at least a Masters, even if one refuses to factor in the x% of a PhD.
PhD students are actively expected to produce and publish novel research, it's literally a requirement to graduate; it's just that these days "incremental improvements" are proportionally even more of the publications than ever before.
I think the more important question is: is there no longer opportunity (and/or motivation) for brilliant 20-somethings to earn a living (in a job that fits their education), work on a PhD dissertation, and still have enough free time to let their minds wander and think about fundamental problems of the science?
The most common complaint I hear is that if you are a PhD student working in "a lab", a great deal of your time goes to either teaching assistant duties or running the experiments your professor instructs/orders you to do.
People call it a master-apprentice model of science education and think it sounds nice and old-timey. Very few people remember that apprenticeship was often grueling work done on very unfavorable contract terms. Benjamin Franklin ran away from his brother's print shop and founded his own in Philadelphia (and became successful). So did Rousseau, apprentice to an engraver in Geneva, who ran away and found himself the protégé, and later lover, of a French noblewoman.
It varies by country and by discipline. In my case, all my time was for research towards my thesis, though classmates who weren't still living with their parents spent a not insignificant amount of time teaching undergrads for money. However, doing research directly necessary for my thesis was a more-than-full-time job, so idle chats about the great mysteries of the field were rarer than I'd have liked.
By contrast, my girlfriend did a PhD in literature, and describes a positively relaxed workload.
In the Days Of Discovery, a scientific education—any education—was available to only a very few. Thus the Age Of Discovery people whom we consider 'amateurs' were the very people who would hold professional degrees in today's world.
Perhaps "revolutions" happen when a new territory is opened up, rather than discovering another idea within the existing territory.
Einstein was no amateur - he was a graduate student in physics before he got his "day job" of being a patent clerk.
One other place where this theory would predict a difference is in the artistic domains. Since we explicitly value novelty, we don't run out of easy melodies like we do easy scientific discoveries.
Music fits this theory in some ways (the most acclaimed artists are young) but not in others (to succeed you need to dedicate all your focus).
Unfortunately, these areas are subjective so determining decline is impossible. But if decline is real we would expect a steady decline in artistic greatness over time.
Though modern instruments also represent an unforaged area. Beethoven did not have a turntable, Bach did not have an amp. And I don't have stats on this, but I'd suspect modern classical composers (movie composers?) tend to be middle-aged.
We do have a lot of “genius” pop and rock stars though, and they usually start young. Would they not count?
I think the degree to which novelty is valued in the arts is overstated. It's valued by (some) critics; not so much by the wider population. At the very least, insofar as they care about novelty, the audience only care about novelty relative to what they've experienced before themselves, not whether something is objectively novel, so the good ideas can be mined again and again with each generation, rather than being depleted.
You can't be considered a genius in the hard sciences by rederiving all of Einstein's equations; John Williams can become one of the most celebrated orchestral composers of his day by (masterfully) emulating the techniques, and in some cases tunes, created by his elders like Gustav Holst and Erich Korngold, in part because most moviegoing audiences who latched on to Williams had never really heard much Holst or Korngold.
(Maybe the argument is a bit facile, but it's easy to construe the much-discussed remake/adaptation obsession of cinema and television in the 21st century as the ultimate consequence of this. All the 'best' ideas, in terms of mass appeal, *have* been found, so now they're just getting recycled over and over instead of barrel-scraping for new ones. A simplification, I think, but I believe it does point at something real.)
My problem with this model is that human genius should be at least semi-constant on at least a per capita basis if it's primarily genetic. If it's primarily environmental then you should expect to be able to produce it in a way we haven't been able to. If it's a combination (like I believe) then you're waiting for the right environmental conditions in which genius can express itself.
However, this has a marginal effect. In the worst conditions one or two geniuses will shine through. In moderate conditions a few more. In abundant conditions many. But once you have many geniuses it makes sense to specialize. When you're one of twenty scientists in England then it makes sense to do five things and to make foundational but ultimately pretty rudimentary discoveries about them. When you're one of twenty thousand then it makes sense to specialize in specifically learning about... I don't know, Ancient Roman dog houses. This creates more and higher quality knowledge. But it creates fewer towering geniuses.
Further, keep in mind you don't have to outrun the bear, you just have to outrun your competition. You can get a lot wrong and so long as you're relatively more correct you'll do well. This also explains why amateurism decreases. A few hundred years ago I'd probably be able to make some serious contributions to a variety of fields. Now I can't. Not because I do not know those fields or have interesting thoughts about them, but because I'm no longer contending with a scattering of fellow amateurs; I'm contending with several thousand people who have spent their entire lives studying whatever it is as a full-time, professional job.
This all seems reasonable as far as it goes, but maybe the impression that we are producing fewer geniuses nowadays is more due to relatively contingent and parochial-to-humans facts about the sociology of fame assignment within groups (as hinted at briefly in point #3) than it is about great feats of insight, or what makes for good examples of creativity or impactful problem-solving or whatever.
In the forager analogy, the other foragers considering the problem of who finds good fruit sources are only able to consider foragers that came to their attention in the first place, and that could be due to reasons other than the actual fruit-harvesting (especially if the fruit harvesting had impact as hard to quantify in isolation from frame as does that of scientific genius).
>maybe the impression that we are producing fewer geniuses nowadays is more due to relatively contingent and parochial-to-humans facts about the sociology of fame assignment
Right. This debate seems to actually be about "What makes geniuses celebrities?" One thing that helps make geniuses celebrities is having been born before the 20th century. 20th and 21st century media have made it harder for geniuses to compete with non-geniuses in celebrity space.
How much of our definition of "genius" depends on celebrity, though? Certainly there have been some famous-with-the-general-public-in-their-time scientists, but would a lot of the people we now label as geniuses get recognized on the street when alive?
Well, "fame within groups" doesn't necessarily imply fame-while-alive, or fame-with-the-general-public. For whatever that is worth. Scott's Example #3 was contemplating the politics of small research groups.
My point is that our awareness -- though not our definition of -- genius depends on fame. How can one person take stock of the number of geniuses across hundreds of fields without using fame as a heuristic?
> would a lot of the people we now label as geniuses get recognized on the street when alive?
I suspect many of their names would have been known by upper and upper-middle-class people. All the examples cited in "Contra Hoel": Newton, Mozart, Darwin, Pasteur, Dickens, and Edison were celebrities within their lifetimes.
There are a lot of examples of famous geniuses who died in obscurity, but they almost all died young.
In 1975, my high school teacher said, contra the Romantic notion that great artists aren't appreciated within their lifetimes, that practically everybody who is famous today was famous within three score and ten years of his birth. E.g., Van Gogh only sold one painting in his lifetime, but if he could have lived another 33 years to age 70, he would have been rich.
Vermeer _might_ be a counter-example. On the other hand, he seems to have been appreciated enough in his own lifetime to have the luxury of working extremely slowly, turning out only one or two paintings per year. But then he died fairly young, and soon after Louis XIV of France attacked the Dutch Republic, which ended the Golden Age of Dutch painting as the economy shrunk.
Vermeer's repute, whatever it was during his lifetime, then faded, although Paul Johnson says that a small line of connoisseurs passed down word of Vermeer's greatness for almost two centuries until he was generally rediscovered in the Victorian age.
Here's a way to measure this question using a pre-selected sample: how many Manhattan Project scientists would have been completely unknown walking through, say, Grand Central Station in the decades after 1945?
I'll give you my subjective opinions as somebody born in 1958 who isn't bad at remembering what was common knowledge and what wasn't. But I'm not a scientist, so the following doesn't reflect the opinion of a professional.
To the extent that Einstein was involved in the atom bomb project, yes, he was an immense celebrity, as famous of a face as Marilyn Monroe.
I think Oppenheimer would have attracted attention from a not insignificant fraction of the passers-by. He was a very distinctive looking man with a gaunt Cillian Murphy-like face (I suspect Christopher Nolan is making his "Oppenheimer" biopic to give his friend Murphy a major role.) Of course, much of his fame/notoriety derived from the hoopla over his security clearance being stripped in 1954 due to his many Communist friends.
Von Neumann was less unusual looking, but he was on TV a lot before his early death.
Fermi's picture was widely shown.
Teller was on TV a lot in the 1970s arguing in favor of nuclear power and the like.
Bohr was a giant, but I have no recollection of his face. Same for Bethe, Wigner, Szilard.
People that aren't all that famous anymore like Seaborg might have been on TV a lot.
Feynman is a folk hero today, but I can recall reading James Gleick's magnificent full page obituary for Feynman in the New York Times in 1988 and thinking to myself, "Wow, this guy was awesome, why did I never much notice him before?" Obviously, Feynman was a legend among physicists while alive, but I don't think he made much impact on the public until the space shuttle explosion hearing soon before his death. And, even then, I wasn't aware of his now legendary O-ring dunking in ice water demonstration until his obituary.
>20th and 21st century media have made it harder for geniuses to compete with non-geniuses in celebrity space.
This sounds plausible. New media (radio/TV/internet) does seem to have more of an effect on the fame of athletes, entertainers, charismatic politicians, and talking heads than on the fame of scientific innovators and sophisticated theorists. This is most likely a shift from the prior era, when fame was built on word-of-mouth and journals/newspapers.
One assumption here is that there is some limit to the number of people we can meaningfully know by reputation, at least without dedicated memorization. I guess this would be similar in effect to Dunbar's number (but presumably not similar in cause).
Scientific innovators do get some new media coverage. But the increased specialization of these innovators' fields, mentioned elsewhere in this thread, makes their achievements less comprehensible, even when explained well on slickly produced video. And so we probably retain less of what we do learn about these innovators. Which would exacerbate the new media effect.
I think the other thing is that often new scientific achievements are done in teams. Much harder to remember 5 names than the 1. Can we meaningfully remember the scientist that developed mRNA vaccines for COVID? No, because they're scientists plural.
I think there is some evidence that support the idea that ML researchers make breakthroughs at younger ages. The classic example would be Ian Goodfellow who invented GANs while a grad student. Also the Turing Award winners, LeCun, Hinton, Bengio, all did their seminal work while much younger.
I don’t buy it. This assumes the subset of the space that’s been searched is a significant fraction of the total space (even if you just consider the “easy” subset). If it’s small, you can always just move slightly to the frontier of the set and find new low-hanging fruit. There’s no reason a priori to assume that this region should not be huge.
In my area, theoretical physics, I see plenty of interesting research problems that are no more difficult than problems a generation or two ago. In many cases, the problems are easier because we have much more powerful tools.
I do, however, see the field unable to pay grad students, unable to get grant money relative to other fields, hemorrhaging good students to outside of academia, trapped in a nightmare of academic bureaucracy, and with an increasingly large number of outright crackpots.
I suppose that one of the problems with modern theoretical physics is the enormous cost of the machines necessary to experimentally confirm the theories.
Or a lack of sufficient insight and creativity to imagine low-cost experiments that could do the same. What did the microwave antenna cost, which Penzias and Wilson used to provide probably the single best piece of evidence ever for the Big Bang? Had you asked someone in 1960 what an experiment that would provide solid evidence for that theory might cost, there's a pretty decent chance he would have named some exorbitant figure -- because the idea of just pointing a big microwave antenna at an empty patch of sky hadn't occurred to anyone.
I had dinner with Nobel laureate Robert Wilson in the late 1970s. He was very modest about how he didn't know that his discovery of universal background radiation proved the Big Bang Theory until some Princeton physicists explained it to him and Penzias. Princetonians have complained about that Nobel ever since.
But as another astronomer asked me a few years ago, "Do you think it's a coincidence that they gave the Nobel Prize to the best experimental radio astronomer of his generation for his greatest discovery?"
In physics I think that a large fraction of the space has already been searched. That is, for pretty much any physical phenomenon you can think of, we have a pretty good explanation. A few centuries ago a layman could sit around and wonder "What the heck is the sun?" or "What's the deal with fire?" or "Where do rocks come from?" but nowadays you can find pretty good explanations to all of these in books aimed at eight-year-olds. We've harvested pretty much the whole space of questions that an interested layman might be able to think of.
The only questions we still can't answer tend to be either (a) extremely obscure questions that only an expert could think of, or (b) possibly in-principle unanswerable like "What happened before the big bang?" and "Why is there something rather than nothing?"
I'm not sure this is true. From things I know about: sonoluminescence does not have an accepted explanation. The Russian toy that spins one way only (a rattleback) is easy to simulate (and the simulations do agree that it spins in one direction only), but there is no model that describes the motion (without the need to integrate) in a way that makes it obvious that it will spin in one direction only.
Not really.
What the heck is turbulent flow and how to predict its behavior?
What is ball lightning?
Why is the sun's corona way hotter than the surface?
Why can you wring gauge blocks?
What happens to the airflow inside a blues harmonica when you bend notes?
Why do chain fountains work, exactly?
Can you jump into a supermassive black hole and survive, since the gravitational gradient wouldn't tear you apart? What would happen from your perspective?
While they sound simple, these all strike me as derived, deeper explorations of the much more accessible and understandable questions, like "why do things fall?", "what is water?", "what is sound?". To see black holes we are "standing on the shoulders of Giants", and to answer many of these we need tech and knowledge gained by exploring the much simpler questions.
I'd never heard of wringing gauge blocks.
https://en.wikipedia.org/wiki/Gauge_block
You're welcome.
Barring perhaps turbulent flow, these questions seem much more esoteric than the kinds of questions that you could reasonably ask 200-300 years ago. Back then someone could ask, "What is lightning?"; now you're asking, "What is a super-rare kind of lightning that almost nobody has ever seen?"
That's because you are privileged with knowledge a person 200-300 years ago would have lacked. As anyone gains information, the questions that puzzle someone at a lower level come to seem simple, and the questions that puzzle him at his current level seem complex. But what's "simple" and "complex" changes predictably with your current perspective, just like what's "poor" and "rich" changes with your own current income.
I mean, I think it's not. Isn't my example pretty clearly an example of something that's just vastly more esoteric by any standard? We have explanations for almost everything that we commonly encounter, and the things we lack explanations for are almost entirely things that are ultra hard to observe. This was not true 300 years ago.
A set of gauge blocks is under 100 dollars, and they're routinely used by at least a 6-digit number of workers in the US. Not exactly rare.
Some of the others are certainly bad examples though. We can set up accurate differential equations to describe turbulent flow physically, but they're just not mathematically well-behaved. They're not analytically solvable, of course, as complex differential equations tend not to be, and they're absurdly sensitive to initial conditions and simulation error.
This isn't a flaw in the equations though; it's them accurately representing the state of nature. Turbulent flow itself is horribly behaved in much the same ways, and virtually impossible to force replicable flow outside of the most carefully controlled lab environment.
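As a toy illustration of that "sensitive to initial conditions" point, here's a minimal sketch using the chaotic logistic map as a stand-in for a chaotic system (not the actual fluid equations): two runs that start 1e-7 apart diverge completely within a few dozen steps.

```python
# Toy illustration of sensitive dependence on initial conditions, using the
# chaotic logistic map as a stand-in -- NOT the Navier-Stokes equations.
def logistic_trajectory(x0, r=3.9, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2000000)
b = logistic_trajectory(0.2000001)   # initial condition perturbed by 1e-7

for step in (0, 10, 25, 50):
    print(f"step {step:2d}: |difference| = {abs(a[step] - b[step]):.6f}")
# After a few dozen iterations the two runs bear no resemblance to each other,
# which is the same qualitative problem that makes long-run prediction of
# turbulent flow so hard.
```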
This. Reading up on gauge blocks, since I'd never heard of them before: is the molecular adhesion really that mysterious? It seems like the sort of thing where the details might be hideously hard to compute but the gist of it is simple enough.
Well, we know the equations for fluid flow, and have done for over a century. The fact that they aren't practically solvable is the issue. This one is one I could see an "amateur" solving, in the sense that I expect advances to come either from an obscure mathematical advance (if analytic) or from an advance in computing (if a numeric approximation) rather than from physics directly.
As for black holes, that's in the "literally untestable even in theory" category - easy enough to say that you'd survive to the event horizon if the grav gradient were sufficiently low and your craft sufficiently shielded (the accretion disk gets extremely hot), but the very definition of an event horizon is that what's inside is unknowable.
The others are much further from my wheelhouse, and I haven't even heard of gauge blocks or chain fountains, which rather argues against them being the kind of phenomena laymen think about deeply.
Another one that took forever to get a good answer for: how does ice skating work?
https://www.vox.com/science-and-health/2018/2/13/16973886/why-is-ice-slippery
You should check out Steve Mould's youtube channel. He's got lots of videos about this kind of physics of random everyday stuff. Sometimes he comes up with a convincing explanation, occasionally there's still a lot of mystery left about how the thing works. I bet there's at least a few videos there where a detailed explanation of exactly what's going on would be new to science.
Also: No one knows how high temperature superconductivity works, though I guess you'd say that's not accessible to laymen.
given the price of high-temperature superconductors, no, I think you need to be a fairly well funded lab to do any experiments on those.
I will, however, note that a lot of historical "lay scientists" were aristocrats who could throw rather a lot of money (for the time) at their experiments, so maybe it's not entirely fair to talk about purely financial barriers to a field?
If the problems are easier because we have much more powerful tools, then you don't get as much genius cred for solving them.
"This assumes the subset of the space that’s been searched is a significant fraction of the total space ... ."
It assumes that searching the subset of space that has been searched has consumed all the "search energy" of the searchers to date. It says nothing about outer limits.
What's your starting point for "a generation or two ago"? I might agree that many of today's theory papers could have been published in 1990 if the computer tech were available, but also I have the impression, in my field at least, that major theory advances have been slow since the 1960s.... (which is not to say it's the fault of theorists at all, but rather that many of their theories are extremely hard to test and there's seriously diminishing returns for the 100th untestable theory to explain something). Some of it is also just luck - supersymmetry is a beautiful, elegant theory, and when it was first thought of it seemed quite plausible that it would be true, but when the LHC finally got running, there was no sign of any SUSY.
Scott, I think this model has less explanatory power than your previous* model, because it fails to account for discoveries which make other discoveries more available. For example, had Newton invented calculus but not the laws of motion, this would have reduced the depletion of the forest in that area, because some things which *could* be discovered without calculus are much easier to discover with calculus. Maybe you could throw in something like builders (teachers in real life) who build roads which make other areas easier to reach?
The point of this is that more innovations in whatever makes things easier to understand, maybe educational psychology (if it's effective at making better learners at scale, which I don't know), would reverse this trend, and the model should have something to reflect that.
*https://slatestarcodex.com/2017/11/09/ars-longa-vita-brevis/
I agree with this argument. Scott's model assumes scarcity - scarcity of fruit, scarcity of knowledge, scarcity of science. However, a dynamic that may apply to fruit, might actually be inverse for science/knowledge. Knowledge begets knowledge. It's a tree that continually branches ad infinitum. 1000 years ago, medicine was a shrub, whereas now it's a massive tree the size of a jungle (sorry for the sloppy metaphors). An invention such as CRISPR or MRNA technologies, opens up new frontiers for tens? hundreds? thousands? important inventions.
Taking educational psychology as an example - Piaget may have made quite important discoveries, but the "science" or the "knowledge tree" of educational psychology is still a shrub. Perhaps it will be a jungle in a few hundred years and everything we think and do vis-a-vis development/education will be different. If important discoveries in educational psychology are made, they are not decreasing the discoveries to be made, but organically? increasing them.
An excellent point. If this branching tree metaphor is better, and I think it is, we might expect some branches to crap out earlier than others (so we get all the breakthroughs and know everything about the field in that branch) but there should be ever more questions to answer, and more discoveries to be made.
If I were to try and tie it back to Scott's metaphor, advancing technology should allow for more efficient searching over time. You build roads (someone mentioned this above), you get horses, you stop going towards places that just don't seem to have more food. Even a linear improvement in how far you can travel gives you access to disproportionately more food, because the area of the circle you can reach grows with the square of that distance.
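Spelling out the geometry (a quick sketch, with d standing for the distance you can cover in a day, out and back):

```latex
% Quick sketch: if you can cover a distance d in a day (out and back),
% the ground reachable on a day trip is a disc of radius d/2, so
\[
  A = \pi \left(\frac{d}{2}\right)^{2} .
\]
% A merely linear improvement in d therefore gives a quadratic improvement in A:
% doubling how far you can travel quadruples the area you can search.
```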
Of course, Scott's model doesn't include things like "Keep going over the same barren, dead end ground because someone wants to pay you to cover that ground, because they want that ground to be true." Sort of the normative sociology that economists make fun of: the study of what should be causing society's ills. Most sciences that even tangentially relate to public policy start to fall into this trap.
To apply Scott's metaphor, scientific discovery is really all about the mushroom caves, not the flat ground. Random events or quirky observations open up whole new approaches to understanding the universe, which then quickly get fully explored by more workmanlike research. This is basically Kuhn's model, and I'm not sure why Scott tends to ignore it--especially since it helps explain why the enormous edifice of the professional scientific community, which is primarily concerned with picking over known ground, produces fewer breakthroughs than its size would predict under Scott's "foraging" model.
I only read about the first third of Kuhn's big book about Paradigm Shifts, but I came away with the impression that, to my surprise, its celebrity is deserved.
These metaphor extensions are fine, but the core virtue of Scott's metaphor (scientific progress ~ picking reachable fruit) is that it's very simple and yet generally fits the data on how our scientific knowledge grows. Or are you suggesting that there is some systematic data that it fails to predict well?
Yes, it fails to predict the actual pattern of scientific advancement, which more closely resembles a series of random bursts of varying size, depth and speed than a slow, steady march.
I think there's an underappreciated aspect here which is that lots of advances may be happening but it gets harder and harder for Scott (and other laymen) to appreciate them.
My subfield was effectively inaccessible until the last century, and my impression is that most scientific fields are similar. No one is asking questions that were conceivable to ask 100 years ago, maybe even 50 years ago. In the metaphor, we set up new cities close to the frontier, and get better and better railways over time. By the end of grad school, you've mastered a multitude of ideas that are pretty new and unexplored, because once someone figures something out, they can teach others.
So this model feels wrong from inside. I'm not constantly walking over already-known things, I learned the modern stuff in school and now everywhere I look is un-tread territory.
But that's a subfield. If you look at math as a whole, there was more big re-defining work in the 1800s than the last 100 years. So to me, things are getting shaken up and expanded all the time, but if you lack the expertise to understand individual subfields, it's hard to explain what new things are happening.
It's unclear to me if the laymen are too far away to see the truth, or the experts are too close to see the truth.
It's often said, and seems objectively true, that far more math has been invented in the last 50 years than the rest of human history put together. One reason for that is there are massively more mathematicians in the world now, for many reasons. So something is happening, the question is how to value it and how to value the people who do it.
I don't recall hearing about Alan Turing and John Nash when I was young, but today, Turing and Nash are the heroes of Academy Award nominated movies. So, fame progresses. There are probably brilliant people born in the 1990s who are little known to the general public today, but who will be folk heroes in the second half of this century.
I think some of it is the distance between novel science and practical application, which has a large time lag and a large component of luck. Maths in particular seems to have a habit of going from obscure trivia to pivotal bedrock hundreds of years after discovery, which of course if good for the fame of long-dead mathematicians and terrible for the fame of currently active ones.
E.g., in 1938, Claude Shannon pointed out that George Boole's binary algebra would work ideally in electronic thinking machines.
Scott addresses this in number 5.
Came here to recommend Ars Longa, saw you already did it.
Plus, some of the early geniuses may well be “names associated with” rather than “sole inventor of.”
For example, despite improvements in the history of science, I bet there were still some husband and wife teams where only his name is remembered (at least in popular culture).
Or Darwin: clearly his ideas grew out of the shoulders of the giants upon whom he was standing, that’s why other people were able to come up with them as well. But we don’t remember the names of those other guys. Similarly for Newton/Leibniz: sometimes the genius halo grows more out of our desire to have a single historical hook on which to hang our story of scientific advances, rather than a deep understanding of the science process.
And if our perception of past genius is distorted by the lens of history, then our comparisons with current geniuses will be less accurate.
"If I have seen further it is because I have crawled to the top of this mound of dead dwarves" doesn't have quite the same ring to it.
Peter Higgs, of Higgs boson fame, was one of about six co-authors.
And the mechanism in question was actually discovered by Anderson several years before Higgs. Higgs himself credited Anderson for the mechanism. But (a) Anderson already had a Nobel prize and (b) he had made himself persona non grata among high energy physicists over the Superconducting Supercollider affair, and no way they were going to let `their' prize go to Anderson...
of which, IIRC, two shared in the prize? At least one of the others was dead by the time it was awarded, which is not that surprising when there's nearly 50 years between the prediction and the experimental confirmation.
This model seems a bit oversimplified in two important ways.
1. Ideas don't really "deplete" like this. Say you come up with some good ideas around factoring large numbers into primes. Someone else invents the computer. A third person puts them together and gets RSA. (There's a toy sketch of this after point 2.) All three of those are good, valuable pieces of work, but I wouldn't say the third idea was "further out" than the first (in terms of how long it would take to get there). It was just gated on the computer.
Lots of ideas are like this -- simple, but dormant until the other necessary ingredients are ready.
2. The campsite "moves" over time. A whole lot of our cognitive technology is encoded deep in our language, tools, norms, etc., and isn't fixed year over year. Even if today's people and yesterday's people could travel the same distance on average, today's people would still be biased to discovering new things -- just by virtue of starting off somewhere else.
Some of this technology is more literal: computers are something like a bicycle in this metaphor. The early astronomers were analyzing data by hand!
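Circling back to the RSA example in point 1, here's a minimal toy sketch (tiny, insecure numbers, purely illustrative): the "third idea" is essentially old number theory plus cheap modular arithmetic, which only became practical once computers existed.

```python
# Toy RSA with tiny, insecure numbers -- illustration only.
from math import gcd

p, q = 61, 53                 # two small primes (real RSA uses enormous ones)
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)
e = 17                        # public exponent, chosen coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)           # private exponent via modular inverse (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # encrypt: m^e mod n
recovered = pow(ciphertext, d, n)  # decrypt: c^d mod n
print(ciphertext, recovered)       # recovered == 42
```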
Point 2 is pretty important. Tools develop. But not all of these are linear: needs develop, too, changing what tools we _think of_.
Analogy: we tame pack animals and horses, and can carry more and travel further. Eventually we develop agriculture. We grow big trees, but now to get more fruit we need to go higher... going further is not an option.
So someone develops a ladder. But someone could have developed a ladder at any time, if they had spent time solving the problem of climbing higher rather than solving the problem of walking further and carrying more. Differing needs can give us insight into spaces that were always there.
Machine learning is definitely still one of the low hanging fruit areas. In this case, you can turn a drug discovery ML system into one that can discover VX nerve gas and a whole new, exciting range of chemical weapons just by inverting the utility function....
https://www.nature.com/articles/s42256-022-00465-9
Worth noting that the objective function for a machine learning algorithm that finds nerve gas is not in any sense the inverse of one that finds a drug. Nerve gases ARE drugs, from the perspective of a machine learning algorithm. Just drugs with specific endpoints, a high premium on low-dose requirement, and a different set of example drugs to work from.
(Forgive me if this point has been made already, I'm writing this comment quickly)
I've been thinking about this a bit recently because I'm trying to write a piece about a related topic (and I listened to this interesting BBC documentary https://www.bbc.co.uk/programmes/m0015v9g on the slowing of science and progress). There's another mechanism which you don't model here: in the foraging model, finding fruit only makes it harder to find new fruit. But in science and tech, a discovery or invention makes future discoveries or inventions easier.
For instance, a wood-handled flint axe is a combination of two earlier inventions, the stick and the hand-axe. Newton's observations about gravity are possible because of the earlier invention of the telescope. The invention of the iPhone 13 is possible because of the earlier invention of the [various things, transistors, touch screens, etc].
So there's a countervailing force: individual discoveries become harder *given a base of zero knowledge*, but there are also new discoveries that become possible because they are simply combinations of earlier discoveries (or new technologies make them more accessible).
In your model it might be more like you're loggers, rather than foragers, and cutting down some trees allows access to new trees, but somewhat further off? I don't know what the equivalent of height might be, but perhaps strength.
I think you don't distinguish inventions (technology) from discoveries (science) enough. The examples you give are mostly technology, where indeed depletion is less evident, because the fact that new inventions must be more complex (the simpler ones are done) is compensated by the fact that new inventions are often a combination of existing elements, and progress makes more/new elements available to combine.
For discoveries, it's not exactly the same. Progress is usually a way to make new observations available, but that's only part of scientific discovery: it helps discriminate between competing theories or show that an existing theory is not adequate and maybe hint at possible evolutions/replacements. Progress is like having new ways to reach the frontier faster, but which you also have to learn: having a car sure makes you faster, but you also spend time learning to drive.
So I think there is indeed a low-hanging-fruit/foraging-ground-exhaustion effect, especially visible in non-combinatorial fields, like base tech (energy production, for example) or fundamental science (physics is a prime example).
oh I definitely think the low-hanging fruit phenomenon *exists*. But I think there's a countervailing force, of previous discoveries/inventions making new ones possible. I didn't distinguish between the two very much because I don't think the distinction is hugely important - Maxwell can't come up with the equations describing electromagnetism without [consults Wikipedia] Ampere's work on electrodynamics and a hundred other people. (Newton's "if I have seen further it is by standing on the shoulders of giants" quote seems relevant here.)
The empirical question I guess is *how much* the "shoulder of giants" effect counteracts the "low-hanging fruit" effect. My instinct is that it should still get harder, but I know that some people (dunno how to add inline links so apologies for the massive URL: https://deliverypdf.ssrn.com/delivery.php?ID=169000029071117028010108118107125117021004027015095011118103084127066112069079092092012120100121011044030127112082120083080119104074001083064089014029010122030115064002035016104004023010080092021005026090000093022067071027028110002121092079095127106123&EXT=pdf&INDEX=TRUE) think that the apparent slowdown is more to do with bad incentives in science, bureaucratisation, a need to progress in institutions rather than do good science, etc.
There's a theory that new science is apt to be generated by new tools, rather than thinking more about data you've already got.
I think you're right, Tom. I've described a similar model based on mining outwards, where progress exposes new rock here: https://www.lesswrong.com/posts/WKGKANcpGMi7E2dSp/science-is-mining-not-foraging
> Let’s add intelligence to this model. Imagine there are fruit trees scattered around, and especially tall people can pick fruits that shorter people can’t reach. If you are the first person ever to be seven feet tall, then even if the usual foraging horizon is very far from camp, you can forage very close to camp, picking the seven-foot-high-up fruits that no previous forager could get. So there are actually many different horizons: a distant horizon for ordinary-height people, a nearer horizon for tallish people, and a horizon so close as to be almost irrelevant for giants.
Doesn't help that there used to be [a tribe with lots of seven-foot-tall people](https://slatestarcodex.com/2017/05/26/the-atomic-bomb-considered-as-hungarian-high-school-science-fair-project/) but [it has since been mostly exterminated](https://en.m.wikipedia.org/wiki/The_Holocaust).
I have a notion that it's easier to think deeply when you're sure your neighbors won't kill you.
Well, there's the famous observation that war and strife created da Vinci and Michelangelo, while hundreds of years of peace in Switzerland could only create the cuckoo clock.
It's from a movie, and I'm not sure it generalizes.
I know a mathematician who believes that military training keeps people from becoming mathematicians, but I'm not sure whether there's solid evidence.
How much personal risk were da Vinci and Michelangelo at? Does it matter whether it's your neighbors or an enemy force?
That's an interesting viewpoint, and I realize I hadn't analyzed my own viewpoint before.
1. Hunter-gatherers were surely always at war and at great personal risk. They didn't make any marked scientific progress for millennia. This supports your point.
2. However, the Second World War can be said to be directly responsible for the advent of the computer, atomic bomb, radar, etc.
Is it possible that if hunter-gatherers only had wars every 5 years or so, and not every week, they would put in a lot of resources into developing new weapons, thus heralding the technological revolution at an earlier date? Is it also possible that if the Second World War had been much shorter, say 1 year instead of 6, a lot of the present-day technology would never have been developed? It seems to me that for technological progress, we need urgency in the form of war, but also slack in which we can play around with crazy and unknown ideas to develop the best inventions/weapons.
There's also the question of *refinement*; you can be a Great Genius of your period, but if the technology isn't up to it, then what data you can gather and what experiments you can perform and what working devices you can create are limited.
The Michelson-Morley experiment was important because it disproved the theory of the ether and in turn kicked off research that would eventually develop into special relativity. But there hadn't been sufficiently precise instrumentation to do such an experiment before then; same with measuring the speed of light, etc.
Your hunter-gatherers can be smart and innovative, but there is only so much they can do with their first tools, which have to be worked on to produce better tools, so that more ore can be mined and smelted, and the smelting process itself improved, until eventually you can manufacture steel; then you're going places with what you can make, how precise it can be, and how flexible and useful it is.
The purported lack of genius today may be down in part to something as simple as "we're still working with bronze implements, we haven't even got to steel yet".
There may be a significant emotional difference between being attacked by enemies and genocide by your own (or nearly your own) government. In war, you have enemies. In a genocide, you can't trust your neighbors.
One thing that needs to be explained is that, when Nazi Germany fell, Jews still existed. A lot of the geniuses from the 40s still existed. Maybe it was a specific sort of schooling that went away -- institutions can be smashed.
It's possible that it's not that the level of accomplishment has dropped, it's just that the publicity machine for anointing geniuses isn't working as well.
Or it's possible that there's a level of trauma which needs to fade.
On the art side, I suspect that a lot of creativity is going into gaming. Someone could be an amazing dungeon master, but their art is personal and ephemeral and they aren't going to be picked out as a genius.
Video editing is a new art form, and it can be popular, but no one takes it seriously the way older art forms are taken.
Leonardo did work as a military engineer and architect for several patrons. We think of him mostly as an artist, but he would have been expected to turn his hand to anything - and was very much capable of doing so. From Wikipedia:
"Leonardo went to offer his services to Duke of Milan Ludovico Sforza. Leonardo wrote Sforza a letter which described the diverse things that he could achieve in the fields of engineering and weapon design, and mentioned that he could paint. ...When Ludovico Sforza was overthrown by France in 1500, Leonardo fled Milan for Venice, accompanied by his assistant Salaì and friend, the mathematician Luca Pacioli. In Venice, Leonardo was employed as a military architect and engineer, devising methods to defend the city from naval attack. In Cesena in 1502, Leonardo entered the service of Cesare Borgia, the son of Pope Alexander VI, acting as a military architect and engineer and travelling throughout Italy with his patron. Leonardo created a map of Cesare Borgia's stronghold, a town plan of Imola in order to win his patronage. Upon seeing it, Cesare hired Leonardo as his chief military engineer and architect. ...In 1512, Leonardo was working on plans for an equestrian monument for Gian Giacomo Trivulzio, but this was prevented by an invasion of a confederation of Swiss, Spanish and Venetian forces, which drove the French from Milan. Leonardo stayed in the city, spending several months in 1513 at the Medici's Vaprio d'Adda villa."
So there was an amount of risk in his life.
Orson Welles' famous comment on the Swiss could be quantitatively tested.
My impression is that the Swiss are reasonably accomplished, but that, Switzerland lacking huge cities, they typically reach their peaks in other countries. E.g., Rousseau became the most famous Parisian intellectual of the second half of the 18th century, Einstein created the theory of general relativity in Berlin, and Euler and some of the Bernoullis spent years in St. Petersburg.
It's a little like West Virginia: West Virginia is lacking in celebrities, but California was full of West Virginian heroes like Chuck Yeager and Jerry West.
Whistler and Welles were talking about art. You may hate the art of the cuckoo clock, but you must respect its engineering.
Another interesting read, thanks! :)
On a single small point : "Since a rational forager would never choose the latter, I assume there’s some law that governs how depleted terrain would be in this scenario, which I’m violating. I can’t immediately figure out how to calculate it, so let’s just assume some foragers aren’t rational".
Isn't there a question of personal preferences and aptitude? Sure, it'd be more productive to go over there but I happen to really like it here and foraging that particular ground makes me feel competent while going over there is arduous for me.
Hence even if it would be more 'rational', I'm not going to do it. 'Irrational' is an acceptable descriptor for that behaviour in economics, but it may not be quite 'irrational' in everyday parlance, it's just optimizing for different objectives.
Let me give an epistemic reason for the stall. There’s a clear barrier to recent progress of the traditional kind, which is (to use the jargon of my colleagues at the Santa Fe Institute) complexity.
Complex systems are not amenable to the Francis Bacon style “vary and test” experimental method. We’re learning a huge amount but returns to experimental methods of the causal-control kind are hard to come by. Taleb is a good example of a person — a real, no-BS practitioner — who recognized many of the same things the SFI people did. In a funny way, so was David Graeber.
Examples of complex systems include the human mind and body; hence why we’ve had so little progress in getting control of (say) depression or cancer, and why the human genome project fizzled after we found the 1-SNP diseases. Much of Econ is similar (IMO the RCT era is overblown). ML is CS discovering the same.
They’re hard problems that will require a new set of tools, and even a new Francis Bacon. The good news is that I think we will crack them. We stumbled on this world in the mid-1980s, but IMO didn’t get serious until the mid-2000s.
Agree with everything you say here for whatever it’s worth. We need a new kind of flashlight to see ahead and specifically a flashlight for complex systems.
Didn't investigation of complex systems at least get started when computers reached the point of being able to somewhat handle chaos?
What about education? Think the day away/teleport thing might break on this one. The tribe writes down descriptions of very distant places in very exacting detail and if a student spends ten years studying it they can get there instantly vs say two hundred years if they tried to go it alone. Or do we define the day as what is possible to achieve even with education?
Another interesting thought is artifice. One day the tribe invents a car. I mean that literally in this analogy, although maybe a microscope is better - or stilts, or a shovel, or something. The mere addition of tools that allow you to reach greater depths or heights causes the depleted land to have new bounty. Some of those technologies exist farther away.
I like this a lot overall. I have a similar analogy about lighthouses I use.
Thanks for the great write-up.
In some sense, a lot of progress in science can be thought of more as "getting closer to the truth" than as "finding new terrain". "Getting closer to the truth" comes from a change of perspective. This change of perspective mostly comes from new technology or observations, like the Michelson-Morley experiment, which gave rise to relativity, or the other experiments that led to quantum physics. The age of the scientists is generally irrelevant. Physics was many hundreds of years old when Einstein and Dirac, young scientists, made their discoveries. Although they may in themselves be giants, it is difficult to argue that such giants don't exist at all today in terms of sheer intellect and hard work.
Hence, I feel that point no. 5 and confirmation bias can explain a lot of this. People learn a paradigm and try to stick very hard to it, until new technology makes experiments possible that clearly contradict those paradigms, causing the paradigms to change. The first scientists to then discover those changed paradigms that accommodate the new experimental results become heroes.
Let's revisit this issue when you've got more data about the scientists, so that you can concentrate on that instead of elaborating the already-clear forager metaphor and then shrugging your shoulders over the real question.
This is really pessimistic without the last part - that at some point, the foragers manage to set up camp in another part of the forest, acquiring untold riches at first, then letting others set up even further.
This is what happened with machine learning, with biotech (next generation sequencing, anyone?), in fact a lot of science is about this kind of camp-setting. "Standing on the shoulders of giants", and it's giants all the way down/up.
There is a huge difference between having to figure out calculus from first principles, and learning it in high school then moving on to something cooler. And then you can have the computer calculate your integrals for you, with calculus relegated to your "maybe figure it out someday when it's needed" pile. Knowledge is a tool for acquiring further knowledge.
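To make that last point concrete, here's a trivial illustration (just an example using sympy, a standard computer algebra library; nothing in the point above depends on this particular tool):

```python
# Tiny illustration of "have the computer calculate your integrals for you":
# a computer algebra system hands back results that once took real effort
# to derive by hand, freeing you to build on them.
import sympy as sp

x = sp.symbols("x")
print(sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo)))  # sqrt(pi), the Gaussian integral
print(sp.integrate(x * sp.sin(x), x))                   # -x*cos(x) + sin(x)
```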
As I say in the original essay on genius, I think it's true that "ideas are getting harder to find" (what you call the "Low-Hanging Fruit Argument"). It's also empirically supported by looking closely at things like agricultural yields. The question is just whether it fully explains the effect, or even most of the effect, and there are reasons to doubt that. For example, the two reasons I give in the original essay to be skeptical are:
(a) if the lack of genius (or let's just say "new ideas") is due solely to ideas getting harder to find, then it is an incredible coincidence that, as the effective population of people who could find such ideas exploded to essentially the entire globe (with the advent of the internet and mass education), ideas got harder to find to the exact same degree. In fact, this looks to be impossible, for there should have been a rapid "mining" of the idea space before it was exhausted, which would have triggered a cultural golden age. It is on this question that the original essay starts, but I've never seen anyone address the fact that changes in effective population should have led to more "finding", and that doesn't look like what we see.
(b) “ideas are getting harder to find” seems especially unconvincing outside the hard sciences in domains like music or fiction. I actually still think there is some truth to it - you can only invent the fantasy genre once, and Tolkien gets most of that credit. But overall it seems obviously true that something like fictional stories aren't as directly "mineable" as thermodynamical equations. And yet, again, we see the same decline in both at the same times, so the explanation needs to extend beyond the hard sciences.
How much of the entire globe has supporting infrastructure (physical and cultural) that makes these places a viable location for research? If we moved a researcher from Switzerland to Kenya or Bangladesh, how would it affect their output?
Surely it might, but perhaps less than one might expect - there are some great universities in India! But consider just within the US: it used to be that only the kids at MIT got the MIT lectures. Now *anyone* can get the MIT lectures. It used to be that only the kids at Harvard got access to the best math teachers. Now there's Khan Academy. Most scientific papers can be found online, even if you don't have institutional access. There are thriving intellectual communities on blogs and in forums, and places to post contributions at zero cost, if you have them to make. Not to mention the mass education - just look at how many new foragers there should be now that racial and gender barriers have been significantly decreased and college is basically mandatory for a huge swath of Americans! An explosion of foragers, and the ease of foraging massively increased, all within a few decades. And yet, and yet.
Khan and MIT lectures are great fallback resources, or adequate resources for getting an understanding somewhere between that of a high school student and that of a really bad undergrad.
I tried learning molecular biology as an amateur before actually returning to college for it (perks of the free, if substandard, education in my country). Maybe I could pull it off if I was a supergenius, but realistically I'd have an extremely fragmented and ungrounded understanding. So for my first contribution to the field, I'd need to:
- read papers en masse where the only layman-accessible words are prepositions
- understand the context of those papers to know why they're doing what they're doing, why it's important for the field, and why it couldn't be done in another, easier way
- identify limits of the state of the art
- come up with a good idea for a novel contribution, by myself, without asking people who already grapple with these problems
- find collaborators and convince them my idea is good and I know my shit
- write the paper
- get it accepted in a journal, or at least self-publish on biorxiv and make a considerable number of domain experts read it
God forbid I need actual wet experiments and funding - that's just the list for pure in silico work!
At any point I can make mistakes - I don't have any experts to look over my shoulder and tell me when I made a mistake. I'd spend years of my life working on a project that is completely irrelevant and either someone did it better five years ago, or it's just something that nobody does because there is no point to it.
>- get it accepted in a journal, or at least self-publish on biorxiv and make a considerable number of domain experts read it
Isn't this exactly the problem caused by greatly increased amount of people involved in science? Too many people publishing so much that nobody can read it all and judge it on its merits. Consequently everyone falls back into social games of knowing the right tastemakers and having access to the right connections. Instead of a democratic/capitalist marketplace of ideas, it becomes an aristocratic society reminiscent of an early Victorian novel. ("Aristocratic" intended as a slur, not its literal meaning.)
A century ago a scientist could have a very good grasp of their field by reading a handful of journals relevant to their discipline. Two centuries ago, you'd do well by reading a single journal (Philosophical Transactions).
"An explosion of foragers, and the ease of foraging massively increased, all within a few decades. And yet, and yet."
Okay, I'm gonna bite here. Take one of your geniuses of the past, plonk him down today, and see if he still counts as a genius. Strip away the mythos of Einstein, take his work from youth to maturity, and compare it with people working in the same field today.
Would Mozart be considered a genius? Maybe. Or maybe he would go into composing movie scores which is lucrative and steady work, and nobody would talk about him as a great genius, even if he radically transformed how movie scores are written and used.
The mark of the past genius should be that they could still come up with novel concepts and discoveries even after absorbing all the progress made in the field since their day. But could they? Would Einstein be able to leap ahead with new ideas, or would he have reached his limit at what are now "yes, this is part of the field everyone has to know" ideas?
I do think there are natural human limits. It may be that we are hitting up against them, and that the intellect sufficient to be a revolutionary genius in the low-hanging fruit days is not sufficient to be the same in days of scarcity of fruit. It could well be that in ten years time somebody comes up with something new and unprecedented which is indeed unforaged territory, and the corresponding Great Men will flourish in that field. Giving up on "where are all the geniuses?" right now seems premature to me.
"there should have been "mining" of the idea space in order to quickly exhaust it, and which would have triggered a cultural golden age."
What makes you think this is not a cultural golden age? How would you define a cultural golden age? As we are constantly reminded, no other society in the history of the planet has been as rich, educated, and living good lives as we are. We have everyday devices that were the stuff of science fiction even thirty years ago, and our poor can access some model of those. We have the Internet, we have access to more information quickly, easily and cheaply than any culture before has ever had. Diseases and health problems that would have been death sentences are now curable with a course of pills or surgeries. Ordinary people have access to resources to be creative that even the great craftsmen and artists of former times could not have dreamed of.
The complaints seem to be along the lines of "where are our colonies on Alpha Centauri?" which, when you think about it, can only be the kind of complaints from a rich, successful, technologically advanced society accustomed to constant progress.
(I'm not saying we are in a cultural golden age, just that golden ages tend to be recognised by looking back to the past and saying 'ah yes, that was a time of wonders'. What will our descendants a hundred years from now think - will they talk about our golden age?)
I don't think most people credit Tolkien with "inventing" the fantasy genre since most everyone knows it existed before him. It's just that nearly everyone since has been writing in his shadow.
So far as I know, Tolkien didn't invent the fantasy genre. He invented, or at least popularized, serious world-building, which has become a dominant part of fantasy.
"you can only invent the fantasy genre once, and Tolkien gets most of that credit"
William Morris would like a quick word:
https://en.wikipedia.org/wiki/The_Well_at_the_World%27s_End
It depends on what you mean by "the fantasy genre". I believe that fantasy is a human norm for fiction, and if anything was invented, it's the idea that respectable adults shouldn't want fantasy.
And then.... there was Tolkien and Star Wars and the return of the repressed desire for fantasy.
The low-hanging fruit argument seems to me very probable, and I love the metaphor with real fruit in it!
I would like to add a small (and quite optimistic!) additional hypothesis concerning the decline in the observed frequency of geniuses, this one related to the growth of the population and of its level of education.
Suppose we recognize someone as a genius when he or she clearly surpasses everyone else in the field; the evaluation is therefore mainly relative, made by comparison with what other people produce in that field at a given moment. In that case, the growth of the population and of its level of education must also very significantly increase the number of people working in any given field. And then, statistically, the probability that the most talented person in a field is much more talented than the second most talented person in the same field is probably much lower than before.
Therefore, we would have difficulty recognizing contemporary geniuses partly because there would be many people doing extraordinary things in general, whereas before there were a few who stood out.
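Here is a rough simulation of that intuition (purely illustrative: it assumes talent is normally distributed within a field, and the function name is my own): as a field gets bigger, the expected gap between the very best person and the runner-up shrinks, so the top person stands out less.

```python
# Rough sketch (assumption: talent ~ standard normal within a field).
# The average gap between the best and the second-best person shrinks
# as the field grows, so "the" genius is harder to single out.
import random

def average_top_gap(n_people, trials=500, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        talents = sorted(rng.gauss(0.0, 1.0) for _ in range(n_people))
        total += talents[-1] - talents[-2]   # best minus second best
    return total / trials

for n in (100, 1_000, 10_000):
    print(f"field of {n:>6} people: average gap between #1 and #2 ≈ {average_top_gap(n):.3f}")
```

(If talent were heavy-tailed rather than normal, the gap would not necessarily shrink, so the hypothesis does depend on that assumption.)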
I like the rough model, but I'd point out that there are certain topological assumptions being made which maybe don't apply. If 'places of insight' were arranged in some Euclidean geometry, then your theory holds.
But if we generalize it to "finding new knowledge requires _either_ walking new ground _or_ exceptional talent" (which I think is totally fair), we might ask whether it's possible to walk new ground via nontraditional approaches. If the _only_ dimension we consider is 'angle and distance from the base camp', i.e. the territory is a 2-D Euclidean grid and we've mapped out where everyone has and hasn't walked, then it becomes much less likely you will _find_ new ground immediately around the camp.
But if the number of dimensions is so high that most people don't even _see_ a bunch of dimensions, then we might actually expect _creativity_ to lead to insights more readily than intelligence.
Or, if technology + economics have changed in such a way that someone might have 10 different mini-careers and still acquire sufficient wealth to do as they please, this might _also_ be 'new territory' where discoveries become easy. So we might expect future discoveries to be more likely from, say, a startup employee turned venture capitalist turned amateur horticulturalist turned poet turned botanist, who synthesized a bunch of experiences that many other people had had _individually_, and yet nobody had yet had _collectively_.
The fruit-gathering analogy might work if someone is the first person to circumnavigate the camp at a specific radius, and to spend at least a few weeks at different angles at different times of the year. They might notice some seasonal continuity between plants growing only at that radius, which might only be observable to someone who had spent the right amount of time in all of those places. In terms of ground, they haven't covered anything new. But if we include time in there, then yes, it's like they _did_ walk on new territory.
So i like the theory if we generalize it as "to maximize your chance of discoveries you have to walk on ground nobody else has walked on before", but it's worth asking whether "the space of being an academic researcher" being extremely well-trodden means that there aren't low hanging fruit in dimensions none of us have even considered looking in.
Like, for all we know, just breathing weird for like 7 years straight could let you levitate and walk through walls. How would we know if this were true? Suppose someone discovered it 10,000 years ago, and they did it, and everyone was like 'holy shit that's crazy' and they wrote stories about it, and today we dismiss those because they are obviously absurd. Are _you_ willing to spend seven years chanting some mantra on the off chance that maybe it'll let you walk through walls? I'm not. Probably most reasonable people aren't. That's some unexplored territory right there! But something tells me it probably isn't worth the effort.
And yet people like Wim Hof exist. This tells me there's probably a ton of low-hanging fruit still around, but it'll be discovered by eccentric weirdos.
Is there any reason to assume science fruit space is even Euclidean?
As players of https://zenorogue.itch.io/hyperrogue know, in hyperbolic space you can fit huge worlds so close that you'll get from anywhere to anywhere else in a few steps. The problem is just knowing the way.
I'm not sure about "taking more time to reach the frontiers of knowledge". Bachelor's degrees haven't gotten steadily longer over time, and previous key discoveries get built into the curriculum. The length of postdocs has (particularly for those eyeing an academic career), but that has more to do with the often enormous quantity of work required to get your Nature Something "golden ticket" paper. Once you start grad school you're basically teleported to the frontier. People learn and adapt quickly.
I think genuine breakthroughs happen on a more regular basis than people think, but we've pushed the depths of knowledge so deep that they're not necessarily recognizable to an outside observer.
I’m not sure about other fields, but I can say that it definitely takes another 2-3 years to hit the frontiers after a BA. This might vary depending on your subfield, but in analysis it definitely can take that long.
I really enjoy analogies, so thank you for writing up this very thoughtful and entertaining model. I think there's another thing at play, which is distraction. I'm not as talented a writer as you, so instead of clumsily trying to extend the analogy, I'll tell some stories about my own medical school class.
I went to a well regarded medical school with lots of brilliant and talented classmates. I will say there was a big difference in how much each of my classmates were excited by discovery, and interest in discovery was largely orthogonal to pure intellectual horsepower. Some of the smartest people I've ever met had exactly zero interest in research--they jumped through the appropriate hoops to get residencies and fellowships in lucrative fields and now enjoy lives where they are affluent, enjoy high social status, and do work that they find interesting enough. I think some of these folks are "potential geniuses" who made a rational choice to take a sure thing (a career as an orthopedic surgeon), over something more volatile (a career doing research in the life sciences).
To give an example of the same effect, working slightly differently, a friend of mine told me that he had taken a job as an investment banker right after college, and then was laid off before he could start working due to the financial crisis. He came to medical school as a backup plan, and is now an extremely talented epidemiologist.
Final story is about a friend who, while he was a post-doc (MD PhD), realized it made much more sense to moonlight as a doctor and pay other post-docs (who were PhDs and didn't have the more lucrative option of taking care of patients) to execute his experiments for him. This was kind of a boot-strappy way of leveraging the resources around him. But I tell this story because he had to make science more of a passion project funded by his actual lucrative career, which was as a physician.
I take away three things from these stories:
1. It doesn't really make a lot of sense to study the sciences (especially at a fundamental, basic level that is most likely to create groundbreaking discoveries) if what you care most about is a comfortable or happy life. True, the rewards are enormous for the right-most outliers, but most people work very hard for tiny material rewards, when they're usually clever enough that they could have nicer lives in most other careers.
2. Having a successful career as a scientist is HIGHLY path dependent. You have to have the right sequence of experiences that give you more and more momentum, while also not having experiences that pull you off your scientist path onto more lucrative or comfortable paths. This is a point that's been made MANY times before, but I wonder how many potentially great thinkers over the last 30 years have pursued careers in management consulting, banking, dermatology, or orthopedic surgery. Obviously these people still make potentially great contributions to society in the roles that they take, but they are much less likely to expand the frontiers of human knowledge.
3. We still probably undervalue most research, as a society. Because the potential payoff is so uncertain, individuals have to bear a lot of the risk of these careers. There's an enormous opportunity cost to getting a PhD and doing a post doc, and even if you are one of the few successes who gets your own lab, it's still not a very materially rewarding situation. So what you end up with is a) a lot of talented people who bail on their science careers for things that are more of a sure thing, and b) a lot of people who never consider a science career because it represents a high-risk, low-reward scenario compared with the other options in front of them.
For the sake of argument, let's grant that your argument as presented is 100% correct. Even so, outsized focus on the political aspect is right and proper because unlike the mechanical causes we have some small hope of changing the politics. Instead of "there's no mystery to explain here" the takeaway could be "we need to run a tighter ship of Science, the deck's stacked against us".
Perhaps a more apt analogy for science is not picking fruit, but planting fruit trees. Planting a fruit tree suggests a scarce return in the short term, but the returns can expand organically in two ways: as the tree grows, and as the seeds from the tree spread to sprout other trees. So, a single planted tree has the potential to spawn an entire ecosystem. Similarly, knowledge begets knowledge.
Excellent point!
Nice.
"machine learning should have a lower age of great discoveries."
Possibly controversial opinion but machine learning is a technological field, and not a scientific one... or rather -- none of the science is novel. The advances in machine learning are a combination of the scale afforded by modern hardware, vast amounts of data, and statistical and curve-fitting theories that have been around forever. The big issue with regarding it as a scientific field (for me) is that they aren't coming up with new principles as such, they're coming up with a set of techniques to accomplish tasks. And in general they have no idea how these techniques actually accomplish these tasks -- the loop is generally suck-it-and-see; hence all the pseudoscience surrounding it and baseless claims that brains work like neural nets, or that sexual-reproduction works like drop-out, and so on.
Another factor is that to make a discovery in machine learning, you need to spend a lot of money on compute, and a lot of money on data (or have an agreement with some company that already has tonnes of it) -- so this also favours established people.
Finally, advances in machine learning are consistently overstated. GPT-3 already absorbs more content than any human has ever absorbed; and people are amazed that it can muddle through tasks that are simple for a human child with a fraction of the compute, or training data. Also, there's a bit of Emperor's Clothes about this stuff. One of the useful things about human cognition is that you can tell a human "hey, there's this interesting thing X" and the human can quickly assimilate that into their model and use it. For example, I can give you a slightly better method for multiplying numbers, and you can apply it pretty instantly. This is what "learning" usually means for human cognition. You can't explain to GPT-3 a better method of multiplying numbers. And there's no mechanisms on the drawing board for how to do it. Sorry this is a bit of a rant, but in my real life I'm surrounded by people who think GPT-3 is basically a human brain and it drives me nuts.
I think you need a different model for science and technology? As you say, physics seems to have stagnated, but our technology continues to advance; cosmology continues to advance, but spacefaring technology regresses; scientific knowledge about birth advances, but outcomes of birth, in terms of morbidity and cost, decline. In software engineering, the science continues to advance, but the technology declines (see "Preventing the Collapse of Civilization" by Jonathan Blow).
The area available to forage is πR², growing with the square of the travel radius.
I'm pondering if this is related to scientific discoveries too. Since geography varies, and science fields vary also, I think there's some merit here. One forager may specialize in muddy seeps, whilst another may focus on the banks of larger rivers, and another robs the nests of cliff-dwelling birds. Each would find different resources: one comes back with a fish, another with cattail roots & duck eggs, another with swallow eggs and nestlings. Likewise in science, someone plays at melting things in the furnace, someone plays with light and lenses, another ponders infinite series.
"Some writers attribute the decline in amateur scientists to an increasingly credentialist establishment"
I suspect that one reason for the credentialist establishment is that it takes many years to reach the state of the art in knowledge, and non-rich people can't afford to spend that many years studying rather than working. The longer it takes to reach state of the art, the more money has to be spent getting that student to the state of the art, and the greater the need for a bureaucracy to decide who gets it and who doesn't - and bureaucracies run off credentials.
One reason I think that the UK is overrepresented in scientific research is that our education specialises earlier than most other countries, which means that, at the expense of a broader education, Brits can reach the state of the art several years earlier than Americans (the average age at PhD is 29 vs 33).
If true, this is an excellent argument for letting gifted kids specialize earlier, while the whole educational community is pushing for a longer period of general education. There are other considerations here - maybe children with a more general education are more likely to lead happy lives and it's worth sacrificing a few potential geniuses to the gods of mediocrity to make that happen.
But if so that just brings us back to Hoel and the idea that an education that is personalized and one-on-one is just vastly superior to our system at cranking out revolutionary thinkers.
I guess if you find yourself burdened with precocious progeny, the strategy is get 'em young, find someone who can cultivate their strengths, and try to keep the truancy officer away long enough that they aren't forced to spend 6 hours a day proving they're reading books they already read.
Mostly off-topic but fun and kind of instructive game: Imagine what a modern education looks like for geniuses of the past. What was Oscar Wilde's mandatory elective? What does a paper about the Themes of the Scarlet Letter with an introduction paragraph, at least three body paragraphs, and a conclusion paragraph look like if written by Newton or Einstein?
"What was Oscar Wilde's mandatory elective?"
Classics, if this Wikipedia article is correct (I read some anecdote years ago about one of his Trinity tutors saying, about Wilde's success in England, "Yes, it was better for Oscar to go there, he wasn't quite up to the mark here"):
"Until he was nine, Wilde was educated at home, where a French nursemaid and a German governess taught him their languages. He joined his brother Willie at Portora Royal School in Enniskillen, County Fermanagh, which he attended from 1864 to 1871....He excelled academically, particularly in the subject of Classics, in which he ranked fourth in the school in 1869. His aptitude for giving oral translations of Greek and Latin texts won him multiple prizes, including the Carpenter Prize for Greek Testament. He was one of only three students at Portora to win a Royal School scholarship to Trinity in 1871.
Wilde left Portora with a royal scholarship to read classics at Trinity College Dublin, from 1871 to 1874, sharing rooms with his older brother Willie Wilde. Trinity, one of the leading classical schools, placed him with scholars such as R. Y. Tyrell, Arthur Palmer, Edward Dowden and his tutor, Professor J. P. Mahaffy, who inspired his interest in Greek literature.
...At Trinity, Wilde established himself as an outstanding student: he came first in his class in his first year, won a scholarship by competitive examination in his second and, in his finals, won the Berkeley Gold Medal in Greek, the University's highest academic award. He was encouraged to compete for a demyship (a half-scholarship worth £95 (£9,000 today) per year) to Magdalen College, Oxford – which he won easily.
At Magdalen, he read Greats from 1874 to 1878, and from there he applied to join the Oxford Union, but failed to be elected.
While at Magdalen College, Wilde became particularly well known for his role in the aesthetic and decadent movements. He wore his hair long, openly scorned "manly" sports though he occasionally boxed, and he decorated his rooms with peacock feathers, lilies, sunflowers, blue china and other objets d'art. ...Wilde was once physically attacked by a group of four fellow students, and dealt with them single-handedly, surprising critics.
...In November 1878, he graduated with a double first in his B.A. of Classical Moderations and Literae Humaniores (Greats). Wilde wrote to a friend, "The dons are 'astonied' beyond words – the Bad Boy doing so well in the end!"
Classics are a good choice for a writer/artist. My high school offered band, orchestra, drama, show choir, and home ec. I was not an exceptional student, and it wouldn't have occurred to me to ask for a classics elective, but if some incredibly talented young person had, I suspect the folks there would have needed to look it up.
"the strategy is get 'em young, find someone who can cultivate their strengths"
How young are we talking, and how specialised? Suppose little Johnny is good at maths, so you identify that as where he has the potential to excel. So you steer him along a path leading more and more to specialisation in maths, and prune away any extraneous subjects. And sure, he ends up excelling in maths - but there's an amazing breakthrough in biology he could have made, except all that was pruned away early in favour of keeping him on the maths track.
The Polgar sisters are chess prodigies, but could they have been doctors, musicians, engineers? We don't know and are unlikely to ever know, because while I don't think their father isolated them with chess alone, that was where the positive reinforcement came in. Doing well at something else was praised, but doing well in chess was where the most attention and most celebration and most reinforcement happened.
What way would they have turned out with a more general education? Would they have been prodigies in a different area, if left to natural inclinations? That's not a question we can answer, but I do think it needs to be asked when we're talking about steering kids to specialise in one topic over another.
Entirely agree - I'm currently expecting and know I am not the sort of person who will be able to steer my own child into doing one thing their whole lives without giving them input. I *do* think this means that I will not raise a "person who gets to be in the history books" level genius, but most "person who gets to be in the history books" level geniuses I read about turn out to have miserable personal lives and feel bad about themselves forever. And an unusual number of them turn out to have serious issues with their fathers so...
> I can’t immediately figure out how to calculate it, so let’s just assume some foragers aren’t rational.
I can't resist that remark. Nerd snipe successful. First, the short answer: after 9 hours of travel you only have 3 hours left to forage, half of what you'd have after 6 hours of travel, so the ground there should yield twice as many points per hour to compensate. So if the area at 6 hours' distance is 50% depleted, the area at 9 hours' walking distance should be 100% virgin to give the same expected total value. Only after the depletion level in all areas within 9 hours' walking distance increases does traveling further become worthwhile.
Compare with the early explorers: Nobody will travel when the area at distance 0 has full value; it is only after the depletion level at close distance starts to become noticeable that people will decide to venture out (and even then, they'll travel as little as possible if they want to maximize their gain).
More general computation. Assuming we are in a state of equilibrium, let D(x) be the depletion level at x hours from camp. Then after walking x hours and gathering for 12 - x hours, you gain (12 - x) * 100 * (1 - D(x)) points. In equilibrium this should be constant, so (12 - x) * (1 - D(x)) is constant, say C. Then 1 - D(x) = C / (12 - x), i.e. D(x) = 1 - C / (12 - x).

Given the assumption that D(6) = 0.5 (you need to make an assumption somewhere), you find C = 3 and hence 1 - D(9) = 1. When traveling further, you'll find that D(x) becomes negative, i.e. the area would need an expected value of more than 100 points per hour to be worth traveling to. In a model where D(x) must be between 0 and 1, the depletion level gradually decreases as you travel further until you reach D(x) = 0, at which point exploring further does not gain you anything.

Note: D(x) = 0 when x = 12 - C. You can measure C by checking the depletion level at any point where people do forage; for example, if the area at 0 hours' distance is 95% depleted, then 1 - 0.95 = C/12, so C = 0.6, and people will travel as far as 12 - C = 11.4 hours to forage for 0.6 hours; 0.6 * 100 = 5 * 12 points. Chances are that far before this point it'll become valuable to invest in ways to travel further.
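For concreteness, here is a small Python sketch of the same computation (same assumptions as above: a 12-hour day, 100 points per hour on virgin ground, one-way travel of x hours, and D(6) = 0.5 to fix the constant; clipping D(x) at 0 and the function names are my own):

```python
# Equilibrium depletion curve derived above: (12 - x) * (1 - D(x)) = C,
# with C fixed by assuming the area 6 hours out is 50% depleted.
DAY = 12      # hours available per day
RATE = 100    # points per hour on completely undepleted ground

def depletion(x, d_at_6=0.5):
    """Equilibrium depletion D(x) at x hours from camp, clipped at 0."""
    C = (DAY - 6) * (1 - d_at_6)   # C = 3 for the 50%-at-6-hours assumption
    return max(0.0, 1 - C / (DAY - x))

def daily_points(x, d_at_6=0.5):
    """Total points for walking x hours out and gathering the rest of the day."""
    return (DAY - x) * RATE * (1 - depletion(x, d_at_6))

for x in [0, 3, 6, 9, 10, 11]:
    print(f"x = {x:2d} h   D(x) = {depletion(x):4.2f}   points = {daily_points(x):5.0f}")
```

Every distance up to 9 hours yields the same 300 points per day; beyond that, even completely virgin ground can't make up for the lost gathering time, which is exactly the break-even point computed above.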
Wouldn't the foragers just move camp? If it takes nine hours to get from present campsite to new virgin area, isn't it simpler to pack up and move the entire camp closer, rather than spend the majority of time travelling to and from the new site, with less time to forage?
Agriculture would be different, as you are rather tied to one area (you can't just pack up an entire farm). Even there, it's easier to drive out your animals to graze in a good area and then drive them back in the evening to be milked, and if it starts to take too much time to travel to and from the pastures, you do things like let the sheep roam free on the mountain and only bring them down at certain times of year (a bit tougher with cattle, but transhumance is definitely something: https://en.wikipedia.org/wiki/Transhumance).
I'm not entirely sure where I'm going with this, mostly that the analogy breaks down for me here. If your foraging grounds are further and further away, you can easily pack up and move closer. You can't quite do the same with fields of knowledge, because you do have to traverse the already-covered ground to give you the background you need before you can get to the new, untapped area.
Oh, absolutely, in my last line I was mostly sticking to the existing analogue that I think Scott intended, in which I interpret the twelve hours roughly as the human lifespan, traveling as time spent learning the basics to be able to understand an unknown field of knowledge, and foraging as actually doing research in said field. In that case 'moving the camp' is (probably) not an option (we all start from birth), but finding ways to travel further might be possible (be it improved education (faster walking speed and/or more efficient foraging), intelligence enhancement (same), lifespan extension (more hours to travel), or [insert favorite way to increase the amount you can learn]).
Sticking to the actual foraging story, I agree that moving camp would in most cases be more sensible than investing in maximum travel distance (especially if you actually manage to reach 95% depletion of the nearby area).
> you can easily pack up and move closer. You can't quite do the same with fields of knowledge, because you do have to traverse the already-covered ground to give you the background you need
The verb in the metaphor should be “assimilate”
not “consume”
perhaps.
I don't think this adequately explains why people who made great discoveries when they were young in 1900 didn't increase their rate of making great discoveries when they were old and bring the average up. One needs to explain 1900-people losing discover-ability as they age, but 2000-people gaining it.
One unmentioned thing can help explain this: the extension of healthspan. The mind is the brain, and the brain is just an organ in the body; if the body is generally dysfunctional, the brain will probably not be in the best condition either. Being in great health instead of poor health probably at least dectuples the probability of some great discovery. The age-related cognitive decline curve has probably shifted a lot due to the extension of healthspan.
Plus, Treponema pallidum has fewer chances to wreck the brains of geniuses nowadays.
"One in five Londoners had syphilis by age 35 in the late 18th century, historians estimate."
I think there's something foundationally missing from this model. Very specifically - what about cranks and weirdos who were retroactively not cranks and weirdos?
More specifically - all of the computer science greats (Dijkstra, Turing, etc.) did their foundational *mathematical* work well before they were household names (at least among people who are relatively intelligent and science-aware).
There's a great revolution that happened around 1980 that suddenly made computer programming, computer software, and thus *computer science* and all of its offshoots - massively more high-status and important because the Fast Transistor Microprocessor was starting to allow more and more things to use Software.
Without the Fast Transistor Microprocessor, none of that work would be lauded as the genius it is (Turing, rather famously, was prosecuted and chemically castrated rather than celebrated), and it would instead be an esoteric curiosity for mathematicians.
I get the feeling that with the amount of Science Infrastructure we have in place today, absent some New Technology or Paradigm that enables a lot of work that was done previously to matter in a new way, or enables new work - most people seeking truth are going to be happily chipping away in the truth mines for proofs or evidence for esoteric effects that aren't super relevant to today. We will lament their lack of progress in world changing and truth seeking for decades.
Suddenly - something will change, some new technology will get invented, or some new mathematical, computational, or scientific tool will become widely known or available, and suddenly proofs from 1960s mathematicians or earlier are no longer esoteric curiosities - they're the world-shaking foundation of the modern universe.
I keep thinking about the time period of the Buddha, Jesus, and Mohammed. (I know that's quite a range of time, but in the course of human history, it's not so much.) Was there just a sweet spot around then for religions? Like, there was enough 'ambient philosophy' around that new and compelling religious discoveries could be made? (Although it's not what I actually believe, for this purpose assume by "discovering" I mean to say, there are certain religious ideas that can be dreamt up that enough people will find compelling that they can gain a real foothold. Discovering is finding one of those ideas.)
Arnold Toynbee thought a lot about this in his “study of history.” He postulated that as civilizations atrophy and then decay, the period of struggle during the long collapse often gives rise to a spiritual upheaval leading to a more enlightened religion. Speaking in broad strokes, Toynbee sees 3 major cycles of civilization (we are in the third), with each time of struggle spawning a more advanced spiritual state (e.g. Baal worship - Judaism - Christianity, with parallels in Asia and India leading to Hinduism and higher Buddhism). This new religion then goes on to define the successor civilization in Toynbee’s model. The time period you are talking about would be for him the struggle period of the second cycle of civilization.
Of course Toynbee is all but cancelled and forgotten these days for his taking religion and spirituality seriously in his historical analysis, and for his sweeping narrative approach which went afoul of postmodern historiography.
If he is right, and if we are in the struggle phase of Western Civilization (debatable), the question is what new spiritual system will arise from the ashes of the West. I feel like that is at least related to your question.
I have a general conviction that hard polytheism is outcompeted by the alternatives because it doesn't hold up to scrutiny or really offer meaning or answer any important existential questions.
But in your comparison, I'd probably nix Mohammed and think instead about Zoroaster. Historians don't agree when he lived -- close to the reign of Cyrus the Great or 1,000 years before? But if we speculate it's the former, then a certain "Age of the Prophet" can start to be seen, centered on Persia and the lifetime of Cyrus, and probably ending with Mani. Cyrus ended the Jews' Babylonian Exile and began the Persian conquest of modern-day Pakistan, on Buddha's doorstep and near-contemporaneous with Buddha's life. He also came towards the end of the age of the Old Testament prophet, though a handful were post-Exile.
Now, as a Christian I'd argue that the prophet of that age was a sort of God-given Jewish "technology" that spread to Persia and finally India, but non-Christians will generally argue the reverse and that Second Temple Judaism imitated Zoroastrianism. I think this is mostly a faith judgement either way.
Weren't early scientists amateurs because science wasn't a profession you could earn a living in?
Steven Johnson has a similar concept he explains in his book, Where Good Ideas Come From: The Natural History of Innovation, called "the adjacent possible." His analogy is that every new discovery opens a door into a new room which contains yet more doors. Each new discovery opens paths to new discoveries.
I have been a bit confused by the premise of this conversation on genius and the perceived implications (concern?) that it seems to be bringing up.
My (oversimplified?) understanding of Hoel's original piece:
1. The world used to produce "geniuses" (towering giants in a single field, or multi-disciplinarians who made large contributions across many fields). Some of them even made their contributions in their spare time!
2. We don't do this any more
3. This is bad/concerning
4. How can we solve this problem?
5. Aristocratic tutoring?
Isn't this essentially specialization playing out? The reason this doesn't happen anymore is the comparative advantage even for people with the same natural talent as past geniuses is more than overcome by the specialization that is required to make a contribution in nearly all fields. Instead of being a problem, isn't this a natural consequence of all of the efforts of those who came before? As Scott's analogy is pointing out, hasn't all of the low-hanging fruit been picked?
That strikes me as a much simpler answer than a lack of aristocratic tutoring.
Interesting article. I think one element this fails to take into account is the general category of surprise/accidental discoveries. Like Kuhn's paradigm of scientific revolutions on a small scale.
To put that in terms of your example: what if one day little Jimmy the forager trips and lands face first on a rock, and realizes it's edible? It doesn't matter then if he is experienced, smart, old or young.
Scientific progress is not necessarily linear?
I think the conceit that knowledge is dimensional is flawed in a number of ways, not least the ways others have already brought up, such as that historical ideas make entirely new ideas possible.
I'll observe that somebody (Cantor) invented set theory. He didn't find a new space in the territory - he created new territory out of nothing.
I agree, new territories get created from time to time.
Sounds solid to me. But I'll nitpick against the claim that `physics is stagnant.' This is arguably true for high energy physics, but physics as a whole remains vibrant, largely by virtue of constantly inventing new subfields (which open up new foraging opportunities). See my DSL effortpost on the topic here https://www.datasecretslox.com/index.php/topic,3007.msg91383.html#msg91383
Per the typology I propose in that effort post, you can divide up physics into six major subfields, of which only one can really be argued to be stagnant. That subfield only accounts for ~10% of professional physicists (according to statistics from the American Physical Society), although it might be more like 99% of `physicists that talk to journalists.'
LOL! I agree, physics as a whole is not stagnant.
I have enjoyed Scott's whole collection of posts around research productivity. I want to throw in another ingredient that I think should get more attention.
In most fields, having a research-focused career has gotten dramatically more competitive over the last generation or two. Intense competition can help motivate people to work harder and reach further, but it can also stifle creativity. I'm specifically thinking here about the need to publish and get grants, and how in highly competitive areas it's easy to shoot down an application or manuscript due to some weakness, even if there's something really interesting in it. It's super-extra-hard to come up with brilliant new vistas to explore when you simultaneously have to defend against a horde of maybe-irrelevant criticisms.
If this dynamic is important (not sure if it is), the only way I see to address it is to somehow collectively limit the number of people who have research careers.
Maybe amateur scientists are less common because our true leisure class is smaller? Even the children of oligarchs like Trump's kids pretend to flit around doing some kind of Succession thing, whereas in the past it was totally normal to own enough land to support your lifestyle and then go off on a hobby forever.
You are never supposed to forage in the immediate vicinity of your camp. The area around your camp should be left alone for emergencies.
Machine learning might be an inherently complicated subject.
Evolution / Darwin took many years because there was so much work involved.
Physics might have essentially been less complicated in the 1920s.
Generally, there is no clear formula for how complicated a field is, and it is only loosely related to how old the field is.
And, of course, Darwin had several centuries of naturalistic observations and descriptions of life forms and life ways to build on. Without those, he'd have had little on which to base his grand theory. The theory is built on centuries of descriptive work.
I tend to respect my doctoral advisor, and that means I tend to respect the economists he respects, including his doctoral advisor and (presumably) the economists his doctoral advisor respected, etc.
what if "geniuses" are just the genghis khans of science?