237 Comments
Comment deleted

Same with authors. It seems the more experience an author has, the better their stories become. Just look at the biggest names in literature, going back centuries, and they're all older.


I don't think the complexity of music distinguishes composing music from composing fiction. It doesn't take 50 years to learn how to craft a good sentence, but crafting a good sentence is to writing as being able to use a scale, a key, or a chord progression is to music. I suspect the two arts are quite similar. Both vary in time along one dimension, so they share structure in a way neither can with painting or sculpture. Novels and movies have the same type of deep recursive structure, and use of motifs and callbacks, that symphonies do. Both arts also use unexpected subversions of the standard structures to create startling emotional effects. I've begun planning some blogs on analogous techniques in music and fiction, but need more examples to make a good series.


An interesting example of the "associated with facing mortality" notion that you mention is of course Schubert, who died at 32 and toward the end of his short life wrote some of the most highly regarded works in the canon (I'm a big fanboy myself, and if we had to rank everyone I would put him alongside Bach/Mozart/Beethoven in the top tier, but I think the general consensus on him is still very positive). He was aware he was dying (of syphilis), and it seemed to motivate him to be exceptionally productive, so one could argue that he's a data point in favor of "facing mortality" being the driving factor for composers more than age per se.

On the other hand, his younger works (before he knew of his impending death) are still quite good, if less consistently so, and probably not quite reaching the heights of the late quartets, the C major quintet, the last 3 sonatas, Winterreise, etc. And of course he's only one data point; plenty of lesser-known composers presumably died young without it provoking sudden genius.

Comment deleted
author

You can see the study here ( https://www.bmj.com/content/357/bmj.j1797 ) - sorry, the link was broken before. It looks like they're looking at hospitalists only (so not specialists); although they don't explicitly say their study is limited to attendings, they must be, since they use "number of years since completion of residency" as their measure for years of experience. My own experience in hospitals is that most patients are assigned somewhat randomly to attendings based on who is on call or otherwise taking new intakes on the day the patients come in.


I think "most people aren't trying very hard" explains a lot of plateaus.

Also, there's a very unclear path for defined skill improvement past a certain point.

Once you're an attending (i.e., fully grown / independent) physician with a few yrs of independent practice, who can handle the mundane, the uncommon, and the very uncommon stuff with ease, you only rarely encounter cases novel enough to really require you to learn new information.

This fits neatly with neural net knowledge eventually plateauing if you don't feed in any new information.

You need new stuff to update on!

How many new opportunities for learning does a 40-year-old attending have?

They need to update themselves on latest practice guidelines and new diagnostic tools but that doesn't really compare to the amount of learning you do as a trainee.

The requirement to learn a lot simply isn't there.


Also, I think you can pretty quickly get some intuition on learning interference, interestingness of info, and max daily learning rate by trying to study huge amounts of info through spaced repetition.

E.g., while studying for Step 1, I learned that somewhere around 150 to 200 new cards per day of medical information was about my sustainable maximum, if I was doing that all day. And I could do something like 1,200 reviews of old cards in addition. After that, learning became extremely aversive, which is not the norm for me and also very inefficient. That was way higher than I had tried before, and in my experience much higher than most people outside of language nerds and medical students ever try to learn. Another point in favor of "most people aren't trying very hard to consciously upgrade their skills."
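
For anyone curious how those review counts arise, here is a minimal sketch of the SM-2 scheduling rule (the published 1987 SuperMemo algorithm that Anki-style tools descend from). The starting ease of 2.5 is SM-2's default; the quality=4 loop is just an illustrative scenario:

def sm2_update(interval_days, ease, quality):
    """One review. quality: 0-5 self-rating of recall (SM-2 convention)."""
    # The ease update applies on every review and is floored at 1.3.
    ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    if quality < 3:         # failed recall: the repetition sequence starts over
        return 1, ease
    if interval_days == 0:  # first successful review
        return 1, ease
    if interval_days == 1:
        return 6, ease
    return round(interval_days * ease), ease

interval, ease = 0, 2.5
for review in range(1, 7):
    interval, ease = sm2_update(interval, ease, quality=4)
    print(f"review {review}: see the card again in {interval} days")

With a passing grade every time, the interval grows roughly geometrically (1, 6, 15, 38, 95, 238 days here), which is why mature cards cost so little daily effort compared to new ones.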


"Most people aren't trying very hard" - there's certainly a lot to that. If I look back on the research I did in my early 30's - wow, I impress myself. Today, a couple of decades later, I definitely do not have the same motivation. Why? There are several possible answers, none of them entirely satisfactory.


My personal theory is that early in our careers we need to break into a field by impressing people, and our total lack of knowledge prevents that from happening. Therefore we have to prove ourselves to others by learning a lot and working hard. After we hit a certain level, typically well above the average layman, maybe the average or above average within the field itself, there's far less push to prove ourselves. If you're the only engineer at the small factory where you work, you've got a clear edge over everyone else you see, so your place is secure. If you're one of 100 engineers at your firm, maybe you need to push yourself harder and only feel secure in the top XX or X%. That doesn't mean your firm of 100 engineers is top rated, maybe being top 5% in your firm is only top 50% somewhere else, but it's enough to attain and sustain status in your life, and you can concentrate on other things.


So there are good things about being part of either a large or a small firm.

In a large firm you may struggle to get to the top 5% or whatever. However once there you might settle down and stagnate.

In a small firm you may struggle to be competent, but once you get there you might stagnate. On the other hand you may keep going, since you don't have someone to measure against. There's also no one to criticize your ideas, which could be good or bad.


I have a hypothesis that the younger mitochondria are more efficient, or perhaps more numerous. I know that today, in my 70's, an hour of programming will nearly wipe me out, whereas when I was in my mid 20's I'd sometimes work for sprints lasting more than a day (though I did get noticeably worse towards the end of any particular period).

founding

You can probably explain this effect with more mundane things like cardiovascular health. Heart disease is associated with some cognitive impairment.


I don't think my explanation is even controversial. FWIW, I don't have cardiovascular health problems.

But it is well known that mitochondria degrade with aging. Exactly why, and to what extent, is (AFAIK) uncertain. So that may not be the problem; it's only a hypothesis. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4779179/


For me personally I think that the main reason I can't do work as well as I used to is I don't have the time or brain space.

Most of my best ideas I had while younger while letting ideas slowly percolate in my brain during downtime, e.g. while going on long walks alone. But I don't have time to go for long walks alone any more, I barely have a minute to myself, so I never have time to think things through at all. Intellectual effort that was once devoted to coming up with new ideas is now devoted to things like arguing with a three-year-old.


I was waiting for this. There are some doctors I know that improve even as attendings because they are genuinely passionate about being the best they can be.


IIUC, with neural nets, when they get full the new stuff overwrites the older stuff. It's not that they forget if they don't have new stuff to update on; it's that their memories are full, and the new stuff literally overwrites the older stuff. I'm sure this is partially dependent on the design of the neural net. Possibly one that splits things into categories could only overwrite stuff within the same category. But it's definitely not tied to needing to "feed in any new information". (I understood that as "needing new stuff to update on". If you meant something else, then this may be irrelevant.)
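
A toy demonstration of that overwriting effect (the literature calls it catastrophic interference), using a single logistic unit rather than a full network; the two tasks are deliberately constructed to want opposite weights, and all data and constants are invented:

import numpy as np

rng = np.random.default_rng(0)

def make_task(n, w_true):
    """Linearly separable toy task: label is 1 when x @ w_true > 0."""
    X = rng.normal(size=(n, 2))
    y = (X @ w_true > 0).astype(float)
    return X, y

def train(w, X, y, epochs=200, lr=0.5):
    """Plain logistic-regression gradient descent on log loss."""
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-(X @ w)))      # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y) # gradient step
    return w

def accuracy(w, X, y):
    return (((X @ w) > 0) == (y == 1)).mean()

# Tasks A and B want opposite weights: the worst case for sequential training.
XA, yA = make_task(500, np.array([1.0, 0.3]))
XB, yB = make_task(500, np.array([-1.0, 0.3]))

w = train(np.zeros(2), XA, yA)
print("task A accuracy after training on A:", accuracy(w, XA, yA))  # ~1.0

w = train(w, XB, yB)  # second task, with no rehearsal of the first
print("task A accuracy after training on B:", accuracy(w, XA, yA))  # collapses

Training on task B with no rehearsal of task A drives task-A accuracy from roughly 1.0 down to about 0.2; interleaving or replaying old examples is the standard mitigation.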


> Wouldn’t you expect someone who’s practiced medicine for twenty years to be better than someone who’s only done it for two?

Isn't this explained by less experienced doctors receiving easier cases?

There was a study showing disparities in survival rates between black and white newborns, mediated by the attending physician's race. Basically, black babies die more often under a white doctor than under a black one.

Greg Cochran thinks this is probably confounded by the fact that harder cases get referred to specialists, and specialists are disproportionately white and Asian. There might be something similar here.

> This suggests an interference hypothesis: once there are too many similar things in memory, they all kind of blend together and it’s hard to learn new things in the same space.

I wonder if this is how memory palaces work.

Instead of having to remember an undifferentiated pile of stuff, now each thing has a kind of filing tag (an imaginary location), which makes it harder for concepts to get lost or crosstalk.

author

These comparisons held positions (specialist vs. generalist) constant. Aside from whether someone is a specialist or not, I don't think there's any tendency for older doctors to get harder cases.


As you get more experience, your job gets easier to perform. I didn't start writing for money until around my 32nd birthday in 1990, and even then I assumed I only had a few rare good ideas. For instance, in 1997, I only wrote two articles all year ("Is Love Colorblind" and "Track and Battlefield"). Granted, I was getting chemotherapy that year, but I wasn't working much either.

Over the next few years after that, though, I figured out it wasn't all that hard to generate decent ideas worthy of articles.

A lot of achievement is just realizing "I can do this." Rock stars, for instance, typically hit their songwriting peaks shortly after they start to get some acclaim, when they realize "I can do this."

Then, after a while, either the public or the artists themselves start to get bored with the interaction of their personality and their talent. E.g., the public loved John Lennon after 1967, but he was starting to get bored.


Too bad nobody seems to be doing extensive research on individual mental performance tied to GWAS data. I imagine even proposing the idea is radioactive in most universities. Perhaps some rich prince will invest in learning about actual human capital.


Consider Jeopardy. Who wins when a knowledgeable professor competes against smart undergraduates? Interference makes people slower, possibly because they have to “search” through more potential answers for a given input.

Speed is an important consideration for most doctors (especially surgeons). Maybe performance is a function of both speed and knowledge that is maximized ~3 years after residency.

I suspect writing articles on general topics without a deadline prioritizes knowledge over speed. You may never reach your plateau.
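
A toy version of that speed-times-knowledge curve (every function and constant below is invented for illustration, not fitted to anything):

import math

# Toy model: performance(t) = knowledge(t) * speed(t), t = years since residency.
# Knowledge saturates with experience; speed declines slowly with age.

def knowledge(t):
    return 1 - math.exp(-t / 1.5)   # diminishing returns on experience

def speed(t):
    return math.exp(-t / 12.0)      # gradual slowdown in recall/processing

performance = {t: knowledge(t) * speed(t) for t in range(31)}
peak = max(performance, key=performance.get)
print(f"toy model peaks {peak} years after residency")   # prints 3 here

With these particular curves the product peaks at 3 years, matching the guess above; shifting either time constant moves the peak, so treat this as a shape argument, not an estimate.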


Did your job as a marketing researcher also get easier with time?


I’m not sure quality of work or productivity is a good proxy for learning. Especially for doctor cure rates. Coming out of their fellowships, most practitioners are as current as they will ever be. They improve from experience, but lack the ability to engage in full time training with top research specialists, and competing life obligations like running a practice, having a family and personal interests mean that their knowledge base and technique may not keep up with the state of the art. At some point the value of experience is eclipsed by getting behind the curve.

Really, I think achieving your key goals, combined with competing priorities, explains a lot of the productivity drop-off. You've gotten tenure/made partner/published a couple of NY Times best sellers/etc. Having managed that, do you really want to spend your extra time thinking hard and poring over some dry tome? Or go rock climbing/fly to Paris for the week/take the kids to the beach/go to some friend's cocktail party?

That’s not to mention mental health and substance abuse. And for many creatives, becoming responsible and getting such issues under control can be just as damaging professionally as letting them run (albeit much better personally).

founding

Not matching case difficulty with doctor competence would be institutional failure and a rather low hanging fruit to fix. I agree that it's possible, but I don't think it should be the default hypothesis. I'd like to see some evidence before I believe it.

There's also the possibility of consults - if for example you have strictly random patient assignment but have freedom to consult other doctors, then expertise is free to move and expertise per assigned doctor isn't really relevant anymore - outcome will depend on other factors.


Tanner Greer wrote about this wrt public intellectuals but it can apply just as well to creative artists.

https://scholars-stage.org/public-intellectuals-have-short-shelf-lives-but-why/

"Fluid intelligence begins declining in a person's 30s. This implies that most humans reach their peak analytic power before 40. Crystal intelligence holds out quite a bit longer, usually not declining until a person's 60s or 70s. This is probably why historians reach peak achievement so late: the works that make master historians famous tend towards grand tomes that integrate mountains of figures and facts—a lifetime of knowledge—into one sweeping narrative.

Thus most humans develop their most important and original ideas between their late twenties and early forties. With the teens and twenties spent gaining the intellectual tools and foundational knowledge needed to take on big problems, the sweet spot for original intellectual work is a person's 30s: these are the years in which they have already gained the training necessary to make a real contribution to their chosen field but have not lost enough of their fluid intelligence to slow down creative work. By a person's mid 40s this period is more or less over with. The brain does not shut down creativity altogether once you hit 45, but originality slows down. By then the central ideas and models you use to understand the world are more or less decided. Only rarely will a person who has reached this age add something new to their intellectual toolkit.

Friedman jets from boardroom to newsroom to state welcoming hall. He is a traveler of the gilded paths, a man who experiences the world through taxi windows and guided tours. The Friedman of the 20th century rushed to the scene of war massacres; the Friedman of the 21st hurries to conference panels. What hope does a man living this way have of learning something new about the world?

More importantly: What incentive does he have to live any other way?

The trouble is that just as our historian reaches her full stature as a public name, her well of insight begins to run dry. A true fan of her works might trace elements of their name-making title back to the very first monograph she published as a baby academic. She was able to take all of the ideas and observations from her early years of concentrated study and spin them out over a decade of high-profile book writing. But what happens when the fruits of that study have been spent? What does she have to write about when she has already applied her unique form of insight to the problems of the day?

Nothing at all, really. Historians like this have nothing left to fall back on except the conventional opinions common to their class. So they go about repackaging those, echoing the same hollow shibboleths you could find in the work of any mediocrity.

In each case the intellectual in question is years removed from not just the insights that delivered fame, but the activities that delivered insight.

The tricky thing is that it is hard to go back to the rap and scrabble of real research when you have climbed so high above it. Penguin will pay you a hefty advance for your next two hundred pages of banal boilerplate; they will not pay you for two or three years of archival research on some narrow topic no one cares about. No matter that the process of writing on that narrow topic refills the well, imbuing you with the ideas needed to fill out another two decades of productive writing.

Public intellectuals who do not wish to transition in their forties from the role of thinker to mentor or manager are going to have a harder time of it. Optimizing for long term success means turning away from victory at its most intoxicating. When you have reached the summit, time has come to descend, and start again on a different mountain. There are plenty of examples of this—Francis Fukuyama comes to mind as a contemporary one—but it is the harder path.”


Yeah, this was the first piece that came to my mind as well. You probably need to cultivate a preference for "the harder learning path" somehow.


Most famous intellectuals and artists are famous not for purely objective, abstract accomplishments but for the unique interplay of their personalities with their ideas. They might want to, and even try very hard to, develop important new ideas later in life, but that's not who they are. It's who they are that allowed them to make their breakthroughs earlier in life, and it's who they are that keeps them from moving on to new originalities later in life.

If they get really famous, their followers go back and figure out the origin point of their best ideas. For example, Marxist scholars in the 20th Century came to realize that Karl Marx put forward many of his most intriguing and appealing ideas as the obscure "Young Marx" in 1843-44, and that, frankly, his famous later "Capital" is kind of a bore.


Does this depress anyone else? I wasted my 20s on hedonism. Now my brain is decaying. Why wasn't I born when we had the tools to combat this already, god damn it


It wasn't a waste if you were enjoying yourself.


It makes me angry that we make young people spend so long in school rather than being productive.


Yeah, I have long been a believer that most people are probably peaking in value as an employee in their late 20s or early 30s. Especially if they delay having kids.


As I wrote in another comment, a lot of this boils down to motivation. After 45, a person becomes less interested in trying to make their mark on the world. Whenever I think about sitting down and writing something of publishable length, I look at the opportunity cost. I could be writing, or I could be living. That is not meant as an insult to writers, as for some the two are the same. In other words, for some, like Scott, writing is peak living. But, personally, I would rather be outside doing activities I enjoy than in front of a screen, if I have the choice about how to spend my time. I did not feel this way when I was younger.


What a great article, thanks for sharing! Almost laughed out loud with how eerily Jordan Peterson fit the publishing pattern (from ‘Maps of Meaning’ to ‘Twelve Rules’) he describes.


Yeah Jordan Peterson...unfortunately...fits into the stereotype


Does a memory plateau explain the alleged plateau in the quality of an artist's works at age 40, though (the finding that this blog entry opened with)? I would think experience has other ways of affecting creative endeavours, many for the better -- for example, age can improve patience, create habits of consideration and discernment, and provide new kinds of experience that act as material for artistic work.


I think that only works up to a point.

One thing I've observed when reading the biographies of famous artists is that the early portions - up until they hit it big - are typically a lot more interesting than those dealing with the prime of their career. The latter tend to all follow the same pattern of "created new work, spent some time promoting it, went back to creating the next one".

It seems to me that once something becomes a career, the eventual falling into a rut is nigh inevitable.


Yeah, I think that's a more convincing explanation. I do also believe that declining energy etc can affect the type of work artists create later in life, and it leads to work that is ('quality' aside) not as celebratory/universally accessible.


From Tom Stoppard's "The Real Thing," written at age 45 about a playwright:

Debbie:[...] How’s old Elvis?

Henry: He’s dead.

Debbie: I did know that. I mean how’s he holding up apart from that?

Henry: I never went for him much. ‘All Shook Up’ was the last good one. However, I suppose that’s the fate of all us artists.

Debbie: Death?

Henry: People saying they preferred the early stuff.


There's more than one reason. Artists are frequently operating off of a "grand inspiration". When they've expressed the low-hanging fruit, development becomes more difficult. When they've expressed as much of it as they choose/can, they've got to develop a new inspiration...which is usually something someone else has done better previously...but perhaps that someone else wasn't famous enough to get heard. René Magritte vs. Andy Warhol comes to mind here, though René Magritte wasn't exactly obscure, or I wouldn't have heard of him. So this part is mainly a comment about how important PR is to an artist being heard. More famous artists can have distinctly inferior works become well known. Things that wouldn't have been publishable before they became famous.


I recently made graphs of the Goodreads ratings of the novels of the friends and competitors John Updike (b. 1932) and Philip Roth (b. 1933). Updike's career looks like that of an athlete, peaking with 1981's "Rabbit Is Rich" when he was 49. Roth's career is less predictable, peaking both young (Goodbye, Columbus and Portnoy's Complaint) and old ("American Pastoral" when he was 64):

https://www.takimag.com/article/roth-vs-updike/

Updike, who wrote a famous article as a young man about 42-year-old Ted Williams homering in his last at-bat, had always had an athlete’s awareness of the inevitability of fading powers. Rabbit, for instance, peaked in high school. Updike was a close student of baseball statistics, so he likely wouldn’t have been surprised by Bill James’ finding that baseball stars peak at 27. Updike’s later books rather cheerfully chart his decline as if he were pleased to see his predictions fulfilled. So, maybe decline was a self-fulfilling prophecy?

Although Roth was also a baseball fan, he didn’t seem to foresee the impending inevitability of his decline as vividly as Updike did. As a social novelist with an elegant but not exquisite prose style, his knowledge of society only deepened with age.

Roth kept plugging away and really hit his stride in his 60s. His most admired novel may be 1997’s American Pastoral, in which a Jewish high school jock legend named Swede Levov, Roth’s tragic version of Rabbit Angstrom, has his beautiful life destroyed by the America of the ’60s.


Thanks for this. Roth is the classic example! What is the energy that kept him writing at a lectern at a late age while suffering from back pain? The style of his books changed too, becoming somewhat more austere and Greek-drama-like in their distilled plotting and directness, and to me the high point in quality came somewhere in the middle of his move from maximalism to minimalism-ish. I actually think having a slightly less capacious memory in his 60s may have helped him focus down on emotional truths and make his late-ish books carry more weight per word.

But this life trajectory does seem to be rare for artists.


"What is the energy that kept him writing at a lectern at a late age when suffering from back pain? "

I kind of wonder if it was the same energy that kept him busy with...other pursuits, of a decidedly less cerebral nature.


I think Esther Perel once said that both work and sex can be fueled by erotic energy, for what it's worth.


And...in Roth's case there wasn't much of a difference :)


I think a lot of creativity is bringing something that exists outside a discipline to that discipline. So in your 30s, you become good-enough that you can execute on the thing you brought in from the outside. But then, you're on the inside. And so you're not bringing in anything new anymore.


Umm, why use an economist to evaluate creativity?


What's your alternative, and does it change the post's main takeaway?

(I'm actually curious. I thought the specific fact that Scott cited an economist was irrelevant to me since I already pretty much leaned towards this hypothesis, albeit from personal anecdata, so it would be good to see a contra take.)


My alternative is to not use that example, as it is both surplus to purpose and of poor quality. Beyond that, I'm not sure what the post's main takeaway really is. "Something something memory, thus performance"? As a first take, I'd suggest that for many people, as they grow older, they have more and more distractions from their work life, so that could equally well explain that set of phenomena.

What do you see as the main takeaway?


I'd summarize the post as

1- most people's skills plateau in their late 30s

2- skills plateau due to memory decay (cf forgetting curves) and "interference" (things that are too similar blend together, so it's hard to learn new similar things)

3- decay and interference mostly explain skill plateaus, but not other related stuff like how interesting some random fact is

My main takeaway is (1) and (3). I buy 'decay' but am less convinced of 'interference' in (2). I interpreted your first comment as objecting to (1) -- maybe I misunderstood?


Well, interference clearly happens with simple neural nets. Whether it happens in people is less clear, but not unlikely. There's also the question of "to what extent does it happen?" Are tastes likely to interfere with smells? With learning vocabulary? There seems to be (anecdotal) evidence that categories are handled separately, at least WRT interference.

OTOH, I've seen claims that people could learn about 3 bits/minute. I never traced that claim to its origins, and consider it dubious, as I don't know how such a thing could be measured. But it, or something analogous, MIGHT be true.

That said, I've heard reports of a person (he was at CalTech, decades ago; heard from a friend) who could read through a physics journal at nearly a glance/page, and remember every word. If so, the limit seems quite variable. My friend said he tested him on a random issue pulled off the library shelf, at a random place on a random page. Not a very thorough test, but a fairly convincing one. My friend said what page and the first few words, and the person completed the paragraph and a bit more. Well, this is second-hand when I say it, so it's third-hand to you. But verbal memory (including numbers and physics equations [and presumably diagrams]) seems to be highly variable in its retentiveness.


Is there a way to exploit interference to improve recall?

Using different environmental cues. Take it from the man himself.

Robert Bjork - https://www.youtube.com/watch?v=oxZzoVp5jmI

The whole lecture is really worth a look.


Thank you. Has he written an introductory book as an alternative to a podcast?


I have not read his "Memory (Handbook of Perception and Cognition)." It might be what you're looking for.

There's a festschrift, but I don't think it's a great introduction. https://www.amazon.com/Successful-Remembering-Forgetting-Festschrift-Robert/dp/1848728913

My original link is to a public lecture. I don’t like podcasts (slooowwww), but I think the lecture format works well at 2x speed. I liked his pictures of penguins.


I've been playing videogames my entire life. I can observe myself slowing down as I age, and the rate of decline seems to be exponential: 35 to 36 was not as significant a decline as 36 to 37.


I played a lot of games in my teens and into my twenties, slowed down for a while, and now find it more difficult to play certain types of games. Some of it is interest (I know what types of things within games I genuinely enjoyed and which I didn't, so some games just aren't worth the effort), but clearly some of it is actual ability. I learn new games more slowly, and struggle with some aspects that I don't recall having a problem with before - especially twitch reflexes and using game controls. On the other hand, I find that I am more observant of details and game interactions, and can generally be more consistent in performance, since performance is based on knowledge more than skill for me at this point.

Bottom line, sidescrollers like Mario I struggle with to the point I really don't enjoy them anymore, but deep strategy games (especially turn based) I am better than I used to be.


This is mostly due to decreased reaction time and coordination though. Those decrements are less relevant to creative artists and physicians


Wouldn't it make sense that creatives do their best work around then? If you use your best ideas sooner than your bad ideas, and if you need maturity and life experience to draw on, that seems like the right age to me.

In general, it seems to me that people don't keep the pedal to the metal throughout their lives. They have zeal and energy and a desire for success, and then, if things are going well, they settle into a routine and take on family responsibilities and get exhausted more easily and so on.


Two things come to mind which seem relevant.

The first is a saying I know from the music business, but which applies just as well to all creative work: "You have your whole life to prepare your debut... and six months to prepare the follow-up."

The second is the Red Queen's bit from Through the Looking-Glass: "Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!"

When you're starting out, even a small absolute improvement translates into a big relative improvement. If you knew five Spanish words and you learn five more, your knowledge of Spanish has doubled. If you already know five *thousand*, learning five more won't be a very noticeable improvement.

Add thereto the fact that you will typically have considerably more time between the moment you decide to acquire a skill and its first application to a real problem (how many years, would you say, pass between deciding "I want to be a doctor" and being able to truthfully say "I am a doctor"?) than you will between this first application and all subsequent ones, and the plateauing of ability becomes much less mysterious: there simply isn't enough time available to get the same sort of visible improvement as previously.
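
To spell out the Spanish-words arithmetic above, as relative rather than absolute gain:

# The same +5 words at different baselines, as a relative improvement:
for known in (5, 50, 500, 5000):
    print(f"{known:>5} words known: five more is a {5 / known:.1%} gain")

The same five words shrink from a 100% jump to a 0.1% rounding error, so if perceived progress tracks relative gains, the plateau is built in.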


I need to be really careful about what I spend my mental effort learning and practicing...

Btw, where does muscle memory come into this? I can do very complex fine-motor-skill things with my hands even after many years of not doing them at all.


The study doesn't really measure peak skill so much as career success; therefore, the answer probably relates strongly to career incentives. An artist's ability to produce a work that becomes famous depends on how they are perceived. Age is a big part of that. Moreover, if people have already succeeded by their late thirties, they may be incentivized to just keep repeating the same pattern, perhaps with minor variations.

Also... the topic of the post is much too large. Putting spaced repetition and peak creativity in the same conceptual category is a bit of a stretch.


Actually, I don't think they generally keep repeating the same pattern. Joan Baez and Bob Dylan switched from folk-acoustic to electric guitar. But I feel the decline in their progress predated that switch. I think they switched because they realized "the well is going dry". Bob Dylan came to prominence on a wave of social unrest. He was a PART of that wave, but was able to express it more clearly than most. He switched to social activism ("Hollis Brown" to "The Ship Comes In"), but he wasn't as clear about the new theme. Then he switched to more personal themes about the time that he switched to electric rock. And then he sort of vanished from my sight, as someone it was unpleasant to listen to. (I dislike electric rock. I'm not even thrilled with ordinary rock. But if I were to listen to it I'd prefer The Beatles.) (Don't take these category names too seriously. I don't know enough about the field to know what names are correct. Just what I listened to.)

The point here is that they didn't stay with the same themes. But it was their first theme that made them famous enough to be heard. The secondary themes they may not have had anything special to contribute to...except being famous.


I see what you mean. I didn't mean that every single artist would follow that pattern, just that most of them might. For example, look at Wes Anderson, who achieved success with his characteristic aesthetic style when he was in his early / mid 30s, then has mostly kept perfecting that single style. I imagine that there are many reasons why he and many others would want to do that. But of course there are exceptions. I guess I am a bit skeptical that creative artists appear to peak in their late 30s because of interference, decay (both of which seem to relate more to language learning than creative success), or the decline of G / general fluid intelligence (which seems to relate more strongly to things like math, theoretical physics, etc.), and I think that it might be more fruitful to look at context, incentives, and how they are perceived. But I agree that not everyone is like that.


That's one way to reframe your famous old in-the-trenches SSC post, "Who By Very Slow Decay"...

On 9/11: I always found this to be a strange case. As a callous sunnuvabitch, the actual day itself had basically no impact on me personally, and I spent most of those WoT years very confused at lots of people everywhere getting really mad about it...but I do remember where I was that day, because *so many people forever after kept saying they'd never forget where they were that day*. In Decay terms, that's spaced repetition daily/weekly for years, and I still get multiple occasions on a yearly basis of that same meme. It's never stopped. So of course an otherwise unremarkable memory was made out to be extremely prominent in my mind, more than I'd have afforded it otherwise.

Contrast to 1/6: I have no idea where I was or what I was doing on that day. Or other things like major holidays, birthdays: mostly forgotten. I don't exactly have a surfeit of geotemporal-day data to keep track of, interference-wise. I am generally really bad at remembering such dates period! So either there's some missing factors in the Decay model, or I'm simply a bad doctor and am already maxed-out at 5 diseases.

I'm also curious what the constraint on Interference cap is. Surely not willpower, or there wouldn't be entire cottage industries devoted to hacking memory capacity. Maybe a genetic component? Maybe that cap is, itself, a form of learned skill, and one can gradually learn to keep a greater number of similar-concepts in head simultaneously in shorter time intervals? (I'd like to think "learning to learn" pedagogy covered this, but it really doesn't seem to.) It's surely also confounded by things like autism...intensive interest in narrow topics would, of course, mean memorizing an outsize amount of similar-facts. If there were a way to consciously choose such interests for more "productive" ends...that could be quite interesting to research, and maybe extrapolate to the "neurotypicals".

Your example of Portuguese Conspiracy is interesting. I don't think anyone's ever made explicit a "vertical integration" model of learning...like, a horizontally integrated education is just classic liberal arts, learning a lot of unrelated subjects, which should help get around Interference. A vertically integrated education would be...like...the equivalent of full stack programming, but for other topics, I guess? Lots of discrete knowledge, different enough not to Interfere with each other, but ultimately culminating in the same meta-skill. Like, learning to cook could include: knifework, mastery of various preparation methods, memorizing recipes, deducing "Salt, Fat, Acid, Heat"-esque chemical relations, learning how food is grown, best ingredient procurement practices, plating and presentation, pairing with alcoholic beverages...etc. I'm not sure how broadly applicable this framework is, since some skills really are just hard to chunk into discrete parts, and outright deception feels like it shouldn't work. (Then again, I'm not sure anyone's ever really studied how "effective" stuff like Christian rock or Ratfic is in stealthily imparting those lessons, compared to/on top of getting the real thing at same time. That'd be a fascinating dissertation.)

I think that's one additional conflation made here - between "skills", the active doing/recall of something, and merely passive informational memory retrieval. Salience should be expected to increase if memory formation is paired with other cues...I couldn't actually tell you the date of the last Presidential election (sorry! really bad with all dates!), but I sure remember the where-and-what, cause it was the only time in my life I went to an "election watch party". One of the few occasions where tears and champagne mix. It's hard to live down drunken sobbing, even if I'm sure I'm the only one who remembers doing it. Compared to, like, learning Spanish or cramming the Medical SAT...yeah, someone might be really motivated to do these things. But they're ultimately academic, and I doubt often attached to salience-triggers. (Perhaps this is part of why individual tutoring has such outsized educational gains? Because intensive one-on-one teaching includes a lot more potential salience-weighting vs. industrial-style mass learning?)


I like your point about skills, avalancheGenesis. I expanded on it in a comment below without citing you, since it was "below the fold" here and I hadn't yet read your full post.

Small point: 9/11 and 1/6 aren't comparable, because 9/11, like the Kennedy assassination, was a shocking surprise that disrupted the course of an ordinary day. 1/6 was an outcome of a closely watched series of events--most of us, perhaps, were checking in on the news at intervals already (and "where we were" was watching a screen). I think when it comes to politics, "Pearl Harbor" events are in a class of their own. Many of us have family Pearl Harbor events as well--we easily recall where we were when we heard the news (the date stuff would indeed depend on later reinforcement--we don't often date our memories).


Thanks. Yeah, brevity isn't my strong suit and I'm not usually fast enough to get first-mover comment advantage.

Fair enough on 1/6. It mostly falls in the same bucket for me, because there's only a small number of __numerical dates__ I feel like I keep getting badgered __over and over by everyone__ to remember...lots of other events, like Pearl Harbour, people just refer to it by the event name. Pearl Harbour's reference pointer is the string "Pearl Harbour", not the $DATE_FORMAT "12/7/1941" (which I had to look up, cause I'd never had reason to know the actual date it happened until just now). There are a few edge cases, like associating the Guy Fawkes Gunpowder Plot with the 5th of November...but even there, does anyone ever recall the year of 1605? I think there is some peculiar etymological phenomenon where history decides to refer to something by a name, or a number...and the rareness of numbers makes them extra-salient. They're the One Rare Disease of event-diagnosis.


In China, the date-name format is a long tradition, so people find it natural to incorporate new ones (for example, Tiananmen was easily accommodated as 6/4 in a tradition that marked the Oct. 10 end of the Imperial Era as "Double-10," the political upheaval against appeasement of colonialism as 5/4, and so forth). My sense is that Chinese kids learn their history calendars better than kids in the US.

Judging by your spelling of Pearl Harbor and your example of the Gunpowder Plot, you're not American. US citizens get bombarded with history on Pearl Harbor Day (which always falls on 12/7), and, of course, our national self-celebration is always called "July 4th," rather than the formal "Independence Day." Few of us over here are alert to Guy Fawkes Day, but because I had a childhood friend with that birthday, it's lodged in my mind (that's the way it works)--and now 1605 may be, since you caught me not knowing it. Americans of my generation may be more inclined toward date memories because national holidays lined up with memorial dates (Washington's Birthday, Columbus Day, etc.). But some years ago they all became modeled on August Bank Holiday Monday, prioritizing three-day weekends over history. (Some day we'll invent a way to make July 4 a perpetual Monday.)


I am a 3rd-generation Chinese-American*, actually, and can confidently state that numeracy (or whatever skill normally helps with learning history-calendars) definitely didn't get passed down either genetically or culturally...one of Mother's favourite stories is the time I asked her, "Mom, what did you do during WWII?" (This was in the 90s; she was 40something.)

Ngl, I didn't even know Pearl Harbour Day was a thing on the calendar - it's definitely not something I ever remember being taught history about, outside of U.S. History in 11th grade, probably. Hopefully. It, uh, wasn't the best public school. Good point about July 4th - although again, I find myself counting off in my head to remember that's 7/4 rather than 6/4. Because I can only remember May is the 5th month, my birthmonth, and always count forwards or backwards from there...calendars are hard. (For Guy Fawkes - maybe it's more a nerd thing, I know it from __V for Vendetta__ and BBC's __Sherlock__. Might have come up in 10th grade World History too, I don't remember.)

Decay-wise, the funny thing about working in retail is that I'm *more* forgetful of holidays than before, even the 3-day weekend ones that are always Yuge Sales for us. I think it's because they've changed reference-class: as one of those essential workers who lets everyone else shop and enjoy holidays, obviously, I never get them off myself anymore. So it's just another workday, a dry academic piece of information rather than an event-to-experience-yearly. Due to how terribly stressful many of those holidays are, I think perhaps I actively try to forget them too...every year I swear "never doing another Christmas season again", but always chicken out. Holidays are regressive redistribution - they take happiness from retail workers and give it to consumers...

*At some point I did pick up a British affectation for "unnecessary" vowels, which...not sure why? I guess I just like spelling words the way they actually sound in my head, and those spellings tend to be more the...correct...colour. It's not an actual accent I affect irl. Although most of my online-gaming buddies growing up were, in fact, Euros of some stripe. So perhaps, via Decay, I just got super used to their way of UK-English-spelling many things? No matter what my American teachers actually taught in school.


Online interactions are filled with surprises, aG. Sorry I was so sure you were a Brit--so much for my career as a clever sleuth.

Pearl Harbor Day may be generational. I buy pretty wall calendars at the mall--do younger people do that (my middle aged kids find it quaint--their calendars are apps)? The one now hanging in the kitchen will remind me of the arrival of the 7th when I turn to December (I just checked), so it is indeed "on the calendar." I suspect that as my Boomer cohort salutes farewell, that particular reinforcement of traditional civic consciousness will fade.

My politics favors economic redistribution, but I hadn't factored in the variety you analyze. Christmas sounds like combat. Thank you for your service!


Well, I dunno if I still qualify as "younger people", but I don't know anyone who keeps a physical wall calendar anymore. Unless it's of the joke-a-day kind with New Yorker cartoons or whatever. It's phone apps and Google Calendar all the way down. And I just checked, my phone calendar does not, in fact, have Pearl Harbour Day listed at all. (It sure does include a lot of other obscure holidays though! Weird priorities.)

Appreciate it. Our company only now decided to start giving out bonus pay for working holidays, which softens the blow a little at least. Funny how more money sure makes a lot of life's indignities more bearable...


An under-noted effect of 9/11 is that I believe it started the entire...meme? not quite...of referring to a significant traumatic date by that date. You see this idea more in fiction than in reality, though; Mr. Robot had (I think) 5/9 as a significant date, and I've seen a couple of other uses that I can't recall now. I definitely remember a news story from the end of the week of 9/11 along the lines of "What are we even going to call this huge thing that just happened?", since "terror attack", or even "series of terror attacks", didn't seem up to the job.

I think January 6th (not 1/6, I don't think, in popular usage) is a bit different; part of the problem, again, is that it wasn't exactly clear what to call it. Everybody here is aware of the contentiousness of the term "insurrection" for it, and "riot", again, seems not to quite capture the importance of it.


I think you're right, Don. Part of the reason we settled on 9/11 may have to do with the emergency number 9-1-1. I recall people using that to refer to the event, and others feeling it trivialized it. The latter group prevailed, but the way 9/11 gradually pushed "September 11" to the side may have been due to the momentum imparted by 9-1-1.


I think with writers, perhaps it's a case of using up all the best ideas, along with the increased energy and motivation to "make it" earlier on.

As for the doctors, maybe it's that the younger doctors are updated on all the new medical techniques while the older ones may have some outdated knowledge? I know with software engineers (which I am one), older devs are often a bit more reluctant to jump into the latest hip dev tool, simply because it's tiring to learn new things all the time.


Also, perhaps selection bias + regression towards the mean in the case of writers. Perhaps if their early works aren't brilliant, they get written off as mediocre writers, and never make it into studies like these.


Relatedly: in creative fields, there are aesthetic and theoretical trends, as well as (in the last ~50 years) increasingly many computerized tools for doing new and interesting things. E.g. When I was in college 5 years ago, I learned the hip new digital tools and was exposed to the newest iterations of the theories of my field. My 60-year-old boss doesn't have that recent exposure. He probably spends more time than me trying to keep up with new aesthetic trends, because I absorbed them through osmosis in the university environment and he has to consciously pursue them. And if I don't continue to actively study the direction the field is going, eventually I will also get left behind.


I would wager that even if you *do* study fashion trends, you'll get left behind. There are always LOTS of different trends at the same time, and only some of them will ever gain a large following. And, of course, many of them will be about explicitly rejecting the current way of doing things.


" know with software engineers (which I am one), older devs are often a bit more reluctant to jump into the latest hip dev tool, simply because it's tiring to learn new things all the time."

One other factor:

It gets obnoxious to learn how to do _the_ _same_ _thing_ on yet another slightly different application. For instance, I must have used more than a score of email systems over the decades. It gets unpleasant to go chasing "ok, where did they hide the reply-to-all feature on _this_ one?". (This also ties into interference: keeping track of where a given feature was placed in each of several different email applications, all of which one uses daily, is a pain.)


There was an interesting study on flashbulb memories, in particular 9/11 memories: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2925254/

It has some interesting data on the rate of forgetting. When asked 11 months after the event, the correlation with the original recollection drops to .63; and two further years later, it drops to .57. This seems to indicate that what we don't forget within the first year, we're less likely to forget later (at least for flashbulb memories). Also, the confidence level was quite high - so, even as people were misremembering the details, they were still sure they had them right.


Well, I remember where I was when I heard Kennedy had been shot, and where I was when the Loma Prieta earthquake hit. I remember basically where I was when I heard Sputnik was in orbit. I don't know where I was when I heard about 9/11. Actually, I can't think, off-hand, of any other "flashbulb memories" that I've got (well, except for a couple of personal ones that have no public tie-in).


Part of it is probably intensity. I'm sure anyone who went to college noticed that some people studied really hard and did well and others hardly had to study to do just about as well. I have an acquaintance who I took some upper level math courses with where the homework was not graded. I vaguely understood the lectures and had to do the homework to learn the material to the point where I could solve problems on it. He never did the homework and only attended the lectures. We did basically exactly the same on tests. I have no doubt that if he tried even a little bit harder he would have developed a stronger grasp of the material than I did and that the upper bound on his mathematical ability is quite a bit higher than mine is.

Related: I've lifted weights since I was about 13. Around 21 my performance plateaued for the first time. I kept following the same routine and never got any stronger. Since my goal wasn't to lift massive amounts of weight or have a bodybuilder physique, this was fine. When I was ~27 I started doubling the volume of weights I lifted in workouts. Suddenly my lifts started going up again and my muscles started getting noticeably bigger (other people commented on it).

In summary: intensity x time = results. Most people cap their intensity at some point and stop putting in more time, so it's no real surprise they plateau. It's also no surprise that those plateaus can be easily broken out of for most of them.


My favorite model is that learning is INEFFICIENT in the computer-science sense: the effort is exponential or super-exponential in the amount learned. The exponents themselves vary a lot for different people learning different subjects or fields. But the pattern is the same: it's easy at first, gets harder and harder, and eventually is so hard that the other factors keep you from learning and retaining more; and even if they didn't, the difference becomes less and less measurable.

It is often quite transparent early on where one's limits lie: a good scout can tell how far a potential athlete can progress. A good teacher can see early on which students are promising, and my contention is that they see where the person is on their personal exponent of learning. There are always exceptions, but that's what they are.

In some fields the exponents are naturally very steep for most people, and in some fields they are very shallow, but also have a big constant in front. For example, an orchestra conductor does their best work in their 60s or later, but it is a lot of effort to get there.

I have this contention from personal experience, with learning as well as teaching, but also from watching others. For example, I am above average in math, but not above the average math undergrad, so I hit my math limit when learning grad-level math: spending more time and effort on a topic led to diminishing returns pretty quickly, resulting in frustration and loss of interest. And also in realizing that the incremental knowledge I worked so hard for fades away from one day to the next. I have also seen two apparently different types of struggling students: some learn slowly but gain knowledge from day to day and keep most of it. Others learn fine at first, but then appear to "hit the wall", and no amount of extra effort they put in gets them anywhere. This fits the model that

Effort = C * exp(knowledge * difficulty)

where the two constants, C and difficulty, can fit a person's learning of a given domain quite well. (Of course, once you hit the "knowledge wall", you need second-order effects, like decay.)
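
One way to see what the model implies: fix the effort budget and invert the formula, so attainable knowledge = ln(Effort / C) / difficulty. A tenfold increase in effort then buys only a constant increment of knowledge, which is exactly the "wall" (the constants below are arbitrary):

import math

# Effort = C * exp(knowledge * difficulty), so with a fixed effort budget E
# the attainable knowledge is ln(E / C) / difficulty.

def attainable_knowledge(effort, C, difficulty):
    return math.log(effort / C) / difficulty

for difficulty in (0.5, 1.0, 2.0):
    k1 = attainable_knowledge(10, C=1.0, difficulty=difficulty)
    k2 = attainable_knowledge(100, C=1.0, difficulty=difficulty)
    print(f"difficulty {difficulty}: 10x more effort moves knowledge "
          f"from {k1:.1f} to {k2:.1f}")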


"Why is getting an introductory understanding of twenty fields easier than getting a masterly understanding of one? "

Presumably, introductory materials are easier? Or at least less technical, and therefore more familiar and accessible.

Also, it seems important to ask whether a field advances or not. A new doctor will be crammed fresh full of new techniques and up-to-date knowledge.

An expert on a more static topic might be able to progress further over time, if breadth of knowledge was more important than being up-to-date.

author

"Presumably, introductory materials are easier?"

Yes, but what does "easier" mean in this case? What makes them easier? What does it mean for something to be less technical? Why does less-technical-ness promote understanding and memory? These are the questions I'm trying to answer.


My recollection of majoring in economics a long time ago was that Econ 101 Micro and Econ 102 Macro were tough, but, having mastered the concepts in those two, the rest of the major was surprisingly easy.

But Econ is an extremely popular major now, so they've made it a lot tougher since then.


I think "easier/harder" may not be the appropriate scale, or "knowledge" an unambiguous measure.

When you begin to learn in a field you are assembling diverse components that can become integrated in an intellectual or practical skill. Most may present as facts to be memorized, but some require mastering intellectual or practical skills more expansive than those you possessed originally in order to deploy a new fact inventory. Those skills, once mastered to a sufficient degree, allow facts to be accommodated and applied more easily in tasks addressed by the skill set, but the facts may be of less intrinsic salience (and memorability) because they are no longer being as often used for skill building purposes. Once you're "fluent" in a second language you still learn new words and structures and occasionally forget rarely used items, but mostly you use the language to do things. New items are no longer central to the project you're engaged on, and most people reach some level of terminal fluency: they never get much (or any) better at grammatical precision or accent once the language is adequately mastered as a tool because the language has become a vehicle, rather than a destination.

As for decay in creativity, I'm not sure that's being framed correctly either. I think outcomes may be reflections of routinization, and older individuals who are dissatisfied with routinization may show as much creativity as their younger selves if they determine to alter their skills. (I have Beethoven, Yeats, and Pissarro in mind as artistic examples.) But if your skills are mostly means to earn a living and support routine but satisfying social rewards, you may be less inclined to put those goals at risk. Metaphorically speaking, if you're reaping rewards applying mastered Spanish, is it a good choice to turn away from working in Spain so you can study Arabic and work in Morocco? (Perhaps not a great analogy, since language acquisition ease seems intrinsically to decline with age, but the creativity that's germane would be in the work, not the degree of language fluency. . . . And, on second thought, language acquisition ease does seem to increase as more languages are mastered.)

Expand full comment

I'd guess that 'introductory materials' are less likely to be built upon unfamiliar concepts. Introductory materials are, by necessity, related to common knowledge, while more advanced concepts are built on recently learned, jargon-laden subject matter.

If you're building a pyramid and some of the blocks in the base are cracked, that threatens the blocks above them. If you're learning complex material and you imperfectly learned some of the basic material that the advanced material is built on, that threatens your understanding of the advanced material.

Also, if a more advanced subject invokes more concepts at once, that could also put more stress on working memory.

Also, "advanced material" is more likely to involve math. :-p

I'd throw in one more thing I've encountered; the further you stray from the mainstream the worse the quality of narrative craft tends to be. Selection of really advanced material may prioritize subject matter expertise over writing or teaching ability. Even if we generously assume that SMEs for more technical material don't lack an ability to communicate, advanced material is still going to offer fewer texts to choose from.

Expand full comment

Very interesting topic. I have had memory issues going back to my very young years. I found it was more a filter effect: what captures your attention versus what you ignore. And after years in a specific expertise, no matter what it is, ego plays a huge part once you know your craft; the wise see this in themselves. Knowledge versus wisdom is the definitive issue: wisdom is knowing when you don't know, and seeking out information or help from others who do. Being humble. Too often, especially in time-sensitive situations where a life is on the line, as in medical practice, instinct plays a huge role: training, procedures, ingrained reactions. But you also have to keep flowing with ever-changing science and technology. Some just let go, decide they're fine with what they know, and leave critical thinking to someone else, relying on others to tell them what to think. Unfortunately this is a very, very common problem. I don't think age is the critical issue; the desire to continue learning is. My opinion. Awesome topic, made me think :) I like that 😀.

Expand full comment

The full explanation probably contains a combination of the interaction between short- and long-term memory (what you were doing on 9/11 is a classic case of episodic memory getting on the long-term highway) and diminishing returns for learning (the first ten Spanish words you learn make you more better at Spanish than the 500th set of ten words). I've always wanted to grammatically correctly write "more better"!

Also, as a Portuguese speaker, I say: ouch, and well played.

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

Ha, I wrote my PhD thesis on memory interference in cortical representations of specific memories and how those relate to what we see in neural networks. I left out some of the conclusion and it'll never be published though because I don't want to be responsible for skynet (half sarcasm).

You don't actually remember where you were on 9/11 any better than you do 9/13 though, you just have a much higher (false) certainty that you remember 9/11 correctly:

https://pubmed.ncbi.nlm.nih.gov/12930476/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2925254/

This doesn't seem to be common knowledge even in the neuroscience community for some reason, but it seems to be a pretty solid result.

Expand full comment

>You don't actually remember where you were on 9/11 any better than you do 9/13

No, I actually do remember far more about 9/11, specifically the moment when I saw the first tower collapse. I even remember calling a friend and wondering if WW3 was next. Zero about 9/10 or 9/12. I guess I was probably in the same city as on 9/11, since I wasn't travelling much at that time, but that's about it.

Also, I remember Columbia's reentry failure, I know where I was and what I was doing.

I mean, sure, most of those days are completely forgotten, but specific moments were somehow burned into memory.

So, congrats on your thesis, but just how thoroughly did you attempt to replicate it?

Expand full comment

People actually do remember the major events of 9/11 (planes crashing into a building) better than 9/10 (bad coffee) and 9/12 (great sex), but not where they were or what they were doing any better. They think they do though, except for you (you know).

>So, congrats on your thesis, but just how thoroughly did you attempt to replicate it?

I made sure it was statistically valid, other people have replicated many parts of it over the last decade.

Expand full comment

No, I can remember my wife waking me up and telling me about it.

Similarly, I can remember coming back from kindergarten to my friend Danny Rich's house on 11/22/1963 and him telling me the President had been shot.

The night before, my father had told me never to stick a fork in an electrical outlet or I'd be shocked. So, when Danny told me that the President had been shot, I had a vivid image of a very naughty President Kennedy down on all fours grinning mischievously as he stuck a fork into a White House electric plug, so I told him, with complete confidence, "No, the President has been shocked."

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

Yeah that makes no sense as far as human memory is concerned.

I have some faint recollection of my father telling me to come watch TV on 9/11/2001, but I was bored and wanted to play with my toy cars.

I haven't the foggiest idea what happened on 9/10/2001, nor 9/11/2002, nor 9/11/2021.

I do remember we were memorizing multiplication tables in math class around then (though I'd have to double-check my counting of the years). I remember slightly better the experience of having difficulty recalling the times tables of 7 and 8 as fast as I would like, and being stressed about it before the exam, but I have absolutely no idea of the exact date of the quiz. I think it was also the year when I got in a fight and punched one of the boys in my class, which I remember very vividly (much better than 9/11), and I haven't the foggiest idea which date that was either: I wasn't paying attention and nobody cared to tell me. Everyone was told what date 9/11 was.

Expand full comment

>They think they do though, except for you (you know).

Ok, could be, fair enough.

I wonder how I could test whether my memories are accurate, since it's really a snapshot of a particular moment, and I think it's very unlikely that other participants have stored snapshots of that same moment. All the falsifiable details seem to fit - the house where I remember being was indeed where I was living at that time, and the people and time of day match too.

How did you check whether your subjects' memories, that they thought were true, were actually montages or whatever? Or am I gravely misunderstanding what you're saying (if so, sorry!)?

Expand full comment

You can't really check the accuracy of a "flashbulb" memory if you don't have video evidence; you can only compare it to what the person recalled to you at other time points. People don't really store memories as snapshots (this is why we have memory interference); trying to recall a snapshot necessarily draws on resources that are involved in other memories.

Expand full comment

Ok, so how does one check your theory? Is it falsifiable at all?

The way it sounds, you're claiming to have "solved" memory, produced a prediction, and when someone says the prediction is wrong, you say it's because he's deluded.

I mean, he/I could be, but so could your theory, right?

This seems important enough to badger you about it :)

Expand full comment

Honestly I suspect TH of trolling.

I have flashbulb memories of both the Challenger disaster and 9/11: I remember the exact place and situation and have a picture of it in my head. These were banal situations; the first was the school cafeteria, the second was walking in the center of Prague when a friend's gf called and told him people were flying planes into the WTC.

These aren't things I've been recounting for years, or even once irl afaik; I just remember these specific things very clearly, in a "flashbulb" manner. I understand and believe that memorable events recounted over and over again can easily become distorted; perhaps they weren't indelible at the time, but, like war stories, became worked into memory through stress or repetition.

The idea that TH thinks he can prove exactly how people remember stuff and (haha) even had to leave stuff out of his thesis (half sarcasm) is more than half bullshit.

Expand full comment

Where did I claim to have solved memory? I claimed to study memory and worked on memory interference for my thesis. The note about what I study is unrelated to the claim about flashbulb memories (which is not what I study). I linked to two studies on flashbulb memories for you to read if you disagree with my two sentence summary (it's not my theory).

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

>You don't actually remember where you were on 9/11 any better than you do 9/13 though, you just have a much higher (false) certainty that you remember 9/11 correctly

That doesn't seem right to me. If literally *any single detail* I remember about 9/11 has some element of truth to it (e.g., I remember that my father called and said he just wanted to hear my voice, even though he knew I wasn't in NYC; he and my mother remember this too), then I remember 9/11 better than 9/13 — because I do not remember a single thing about 9/13.

Expand full comment

Could you expand on this? Like all the other commenters, I remember many details of what I was doing on 9/11 with certainty. I also don’t remember some details, like what people around me were wearing for example, even though my memory has filled the image in with clothes. I can see stuff but know that that part may not be accurate. Does the effect only apply to people who experienced actual traumatic stress from the event? We must be misunderstanding.

Expand full comment

Memory doesn't work like a video camera; it is just the strengthening of connections between groups of neurons spread out across a variety of different brain areas. Recalling memories makes them stronger and longer-lasting, but it also makes them easier to edit or to incorporate other memories into. We are very accurate at remembering the fact of 9/11 and that it was important, but our brains are secretly trying to incorporate that remembered fact into our general mental map of the world, and this results in modification of the details. Unfortunately, our certainty about the fact of 9/11 gets mapped onto all the details we fill in around it.

Expand full comment

Oh I see, interesting. So the memories are more susceptible to editing from later input than standard memories, but it’s not that they are probably inaccurate. They are just more likely to be inaccurate than a standard memory. Is that right?

Expand full comment

Sorry, I seem to be giving the impression that flashbulb memories are less accurate than normal memories, which is not the case. It is just that we are more overconfident that they are correct than we are for normal memories.

Expand full comment

That's definitely not true for me. I am 100% certain where I was when 9/11 happened and exactly what I was doing. I don't remember 9/13 at all.

Expand full comment

Me too. We are both probably confabulating.

Expand full comment

Yes, that could very well be true.

Expand full comment

I think your position is absolutely incorrect.

I know for a fact where I was, because I distinctly remember my homeroom teacher having the classroom TV on as we all kind of numbly watched the screen. I remember one person (my teacher), and my location (homeroom). Those are for sure facts.

Expand full comment

I really like how when I link to a study that finds people are extremely overconfident about their recollection of 9/11 everybody tells me how confident they are.

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

I know literally nothing about any of the other days in that entire month. I have memories of the location I was at on that day.

EDIT: Took a quick glance through the papers, and they don't support your claim anyway. Yeah I am sure people's memories do indeed get distorted over time. They still remember 9/11 MORE than 9/12.

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

My main point, which I admit I was overly snarky about, was that the main difference between flashbulb memories and normal everyday memories is confidence NOT accuracy. People think they remember a bunch of details and will get angry with you if you tell them they don't, but when you compare their recollections you find the details have changed at the same rate as normal memories.

Expand full comment

I really like how when you link to a nebulous study in a very nebulous topic to study (perceived memories) you become extremely smug and self-righteous.

Expand full comment
Aug 19, 2022·edited Aug 19, 2022

I find that similar events "fuse" into a single memory of one very eventful occasion (for example, all the Christmas family gatherings at a specific place becoming a single evening). This fusing is stopped when the event is tied to something very specific that defines it (location, season, etc.).

Expand full comment
founding

Do none of you people keep diaries? Or logs, journals, whatever. It's easy enough to check whether your memories of e.g. 9/11 are accurate, and as it turns out mine are pretty solid. 9/13, not so much.

Expand full comment

Memories (the neurons and their connections) are enhanced when a strong emotion is tied to the memory. Strong memories form from strong emotions.

Expand full comment

Exactly, this immediately came to mind when I read the section on Hebrew words.

Maybe there's a way to exploit it to improve memory? Like, if there's some fact you want to remember, you can get really mad at it? Though I'm not sure this can be faked.

Expand full comment

Probably amygdala involvement.

Expand full comment

What I didn't see any commenters pick up on is the concept of deliberate practice in expertise. I haven't done any in-depth research into it, but besides plateauing through decay and interference, skills can be developed systematically. This seems to be a good overview article: https://fs.blog/deliberate-practice-guide/

Some of the components are:

1. having a teacher or a trainer who makes sure you're constantly on the edge of the development curve, i.e. they stretch you past what you know, but just a bit.

2. having (constant and instant, preferably) feedback, i.e. you have to be able to couple your actions to successes tightly.

3. experiencing mental demand - I can confirm this from my climbing experience. When the route isn't challenging, I climb on autopilot, and there are presumably little to no gains. Mental demand is presumably linked to attention, i.e. being able to hyperfocus on whatever you're learning might lead to outsized gains. I vaguely remember there might be some mechanistic understanding in this Huberman Lab episode: https://hubermanlab.com/understand-and-improve-memory-using-science-based-tools/

4. Repetition - each skill consists of smaller elements, and being able to perform these subskills automatically frees up RAM for tackling other components of the skill. That's why, for instance, when you learn to swim, you try to isolate the movement of individual limbs while holding everything else constant. When you begin to habituate to this subskill, you start including more and more limbs as your RAM frees up. This seems to be linked to #1 and to generally always watching whether what you're doing is within or outside your comfort zone.

To sum up, I'd say that plateauing isn't just because of "negative" influences like decay and interference. It's also that we're probably not feeding the system enough novel stimuli to make it stretch, grow, and adapt.

Expand full comment

Reading this comment made me think about playing tennis when I was younger. It always seemed that the best way to improve was to play someone who was just a bit better than you, such that you would win maybe 1 in 4 matches against them. Playing against someone much better or much worse than you was much less helpful.

Maybe in other fields with less immediate feedback it's harder to find that sweet spot. Which aspect of your field should you study next and how do you know in advance if it's just the right amount of challenging to promote optimal growth? Without the guidance of someone who has both a comprehensive knowledge of the field and of your current understanding of it, it seems like it would be difficult to consistently set yourself up for that kind of growth.

Expand full comment

Emotion plays a huge part in forming memories. If something makes you laugh, cry, rage, you're much more likely to remember it. Hence your memory of Master of Ceremonies. Perhaps new doctors stop learning when they become emotionally numb to new diseases?

Anki flashcards are much more effective if you use amusing images on the cards and funny concepts. (Hat tip to Fluent Forever book).

I'd be interested to see real data rather than just anecdote on the two languages thing. I'd also bet it could depend on the languages.

I speak several Latin languages which are all relatively similar. I learned Spanish first, but then Portuguese, Catalan and French more or less simultaneously, and ultimately found it best to concentrate on one at a time, as otherwise it's easy to muddle them since they're so similar. Something that wouldn't happen with Spanish and Chinese.

Expand full comment

Also, all of reality is models on top of models. You have one model of a friend, probably the first, and all other "people" are layers on top of that base layer that modify it, like an onion. The brain doesn't build a new model for a new person; it takes an existing net, adds some neurons for distinction, and calls it "Dave", when really it's mostly "Brad" with some Dave on top.

A theory could be that the onion can only stack so many models on top of models, limiting the scope of career learning.

Expand full comment

The reason I think this is... I was on ecstasy and I did the breathing thing with a friend, where you breathe deeply, hold it, and they squeeze your chest so you pass out. The brain rebooted, and I saw the layers rebuild.

Expand full comment

Really interesting. A couple of other hypotheses I find interesting:

1) Habit formation

These guys look at teacher habit formation and find that the plateau in teacher effectiveness (which is very similar, coming around years 3-5) aligns with teachers' increasingly habitual behaviour. As a new teacher, you have no idea what you're doing. You start finding ways to deal with the classroom, and because you do them repeatedly, they soon become habits. Habits are hard to shift, so you end up sticking to what feels like it works. Most teachers then don't get much better for the rest of their career. https://bera-journals.onlinelibrary.wiley.com/doi/abs/10.1002/rev3.3226

2) On doctor experience vs quality

This systematic review found something similar on doctor quality, but put it more starkly. More experienced doctors are less likely to know best practice, less likely to adhere to treatment standards, and have worse outcomes. One explanation they offer is just that you get fresh training in the latest medical knowledge and practice during training - then you don't really update that for the rest of your career. I often think about this when I encounter old doctors prescribing out-of-date stuff. http://annals.org/aim/article/718215/systematic-review-relationship-between-clinical-experience-quality-health-care

3) Memories

The other thing worth considering here is the distinction between episodic and semantic memories. 9/11 sticks because it was a strong episodic memory - there was a strong emotion/feeling/experience attached to it. The same could never be said of the class you attended the day before....

Expand full comment

Final thought - I'm writing this while being screamed at by my baby, who should be going to sleep, and trying to keep my other kid happy. Before kids, I spent a lot more time reading about work, writing about work, going to conferences to talk about work, and meeting people socially and talking about work. A big part of my personal plateau in my 30s is having discretionary time cut by about 90%!

Expand full comment

Re item 2: I suppose they actually do update their medical knowledge and practice after training...except that they update primarily via careful attention to their and their close colleagues' experiences with actual patients. In other words, their primary source of new knowledge is anecdotal knowledge, with very high salience but limited generalizability.

Expand full comment

The linked article by Ingraham shows a relatively wide distribution in age. There is not much difference in the number of works produced by 25-29 year olds and 45-49 year olds. True, there seems to be a peak between 30 and 40, but there's still a decent amount of good work being done by much older people. Indeed, if I take the median age, it seems to be around 40, which means half of these outstanding works are done by people older than 40.

It might just be that once you have created an outstanding work, you are unlikely to put the effort into creating another one. That would give some bias towards completing that outstanding work younger, but doesn't mean that if you haven't made one by 40 you never will.

As for the doctors, perhaps younger doctors are more open to new ways and techniques that have better cure rates? Older doctors are perhaps more likely to stick to what they know works, even if a new approach can work better and have a better cure rate.

Expand full comment

Or you use up your really good ideas. E.g., Shakespeare wrote "Hamlet" around age 35-37 and it's clear he was indulging himself because of how good he had gotten: it goes on for a ridiculous four hours, longer than Shakespeare's prior and subsequent plays. Shakespeare, who no doubt was a fine critic of plays, appeared to realize, however, that he was hitting his Career Year and let himself ramble on because, hey, he was writing "Hamlet."

Perhaps Shakespeare got lazier after Hamlet, but, you know, he was Shakespeare so I figure he did more or less fulfill his potential.

Expand full comment

This just in: memory very complicated, poorly understood :P

A further complication is that learning isn't just memorization - getting better at a skill, especially something like medicine or engineering, often involves noticing patterns and using them to create heuristics. Once you have the heuristic you no longer need to treat everything in its reference class as a one-off that you need to memorize the answer to.

And then often these build on each other and you can notice higher and higher level patterns. Certainly this has happened to me in my career as well as with other skills I've continued working on for many years, and it's more surprising to me that this apparently doesn't happen to doctors or doesn't help them to be better doctors than that their ability to memorize different diseases is limited.

Expand full comment

Medicine is different because progress often occurs in major shifts. 10 years later, the cutting edge procedure to repair a bile duct you learned with repetition and oversight as a fellow at a major teaching hospital has been superseded by a safer, more effective procedure that is completely different. 18 years later, there is a technique that relies on use of completely new technology that is so much better it makes your original approach almost malpractice. And this is only for the one bile duct issue. Logistically, how do you keep thoroughly relearning everything?

I’m a corporate lawyer and matters are entirely different. Yes, contract law changes over time, but it’s evolutionary not revolutionary. Maybe the latest Delaware chancery court decision has enormous implications for how material adverse effect clauses are interpreted in a hostile M&A setting. But it is adding to an existing framework or refining it, not upending it. The fundamental framework remains the same you learned in Contracts 101. Really, the foundation is built on case law from the 1880s, the framework has just been continually built out as novel problems have been presented to courts. That’s why corporate lawyers in their 60s are among those highly sought after to deal with thorny issues.

I think that people in the sciences have a skewed view of this. It’s extremely difficult to keep relearning new frameworks. But if your field allows you to learn one and tweak it continually as required, you can make important contributions until true mental decline sets in. Or at least until you get a sports car and a mistress and start spending all of your summers at your Montana ranch.

Expand full comment

Agreed that this varies across fields, but...

I'm a software engineer. Software is one of the fields that gets upended most often and where skills become obsolete the fastest. And yet senior software engineers are usually much more effective than junior ones. (Although, to be fair, there are not many software engineers who have been doing it for 30+ years; I'm not sure what happens to people at that point.)

Expand full comment

There is definitely something to the interference hypothesis.

Back when I was a jack-of-all-trades kind of programmer, I remember learning Ruby actively made me worse at Python, because the languages are pretty similar and I constantly messed up what would work in which language. At some point I decided to pick a scripting language and stick with it, only rarely using anything else.

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

I hope you picked python ;)

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

I wonder if language interference works differently for different people, and that’s why some struggle with learning multiple languages more than others. Learning Mandarin Chinese has seriously interfered with my knowledge of Spanish: it feels like it’s overwritten the “foreign language” part of my brain.

Expand full comment

Agreed! In school, my foreign language was French through 6th grade, then Hebrew in 7th and 8th grade, then French again in grades 9-college. French and Hebrew are nothing alike, but when I was learning Hebrew it overwrote my French vocabulary, and when I went back to French it overwrote my Hebrew. At some point I had learned enough French for it to leave the "foreign language" slot, and now I can learn something else without having it take up residence in the middle of French.

In fact, now I can learn Italian and *benefit* from knowing another Romance language -- which doesn't look like interference at all!

Expand full comment

The market tends to reward experience quite generously; more experienced workers can generally command higher wages. Is that a kind of market failure, then?

It does correspond to my own experience: I'd say I peaked as a coder in the first year to year and a half after starting, when I was making a deliberate effort to improve. Since then, only the limited set of skills I use for work gets exercised, and a lot of what I used to know has decayed. But then senior devs obviously get paid much better than juniors.

Expand full comment

In some fields, there's for sure a market failure with respect to paying for experience.

In programming, though, I think some of the more qualitative skills are quite valuable and take many years to develop. In particular, a lot of "big-picture" skills--project management, architecture design, and dealing with legacy vs green-field code--require multiple projects' experience to develop. Moreover, the feedback cycle for writing code is short, so it can be learned quickly, while the feedback cycle for knowing *what* code to write is long, and it takes more experience.

I believe there are other market failures in programming, though :)

Expand full comment

One thing I've noticed affecting my ability to memorize is the interconnectedness of the material. The more links it has to things I already know, the more likely I am to remember it. And often, if I forget something, I can reinvent or guess it from the connections. Yet if it was an arbitrary tidbit, forgotten is forgotten.

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

> Could you ask her to learn a second language, but secretly it’s just more Spanish words, and at the end you tell her she was learning extra Spanish all along?

I learned German in a non-formal environment, then I learned Swedish (which is very similar to German) and I found that I had trouble remembering the German words. Instead, the Swedish ones (similar but not identical) kept popping into my head. I kept trying to form German sentences, and coming up with a strange German-Swedish hybrid. Even the grammar got kind of confused. Interestingly, it didn't work the other way round - speaking Swedish worked without interference from German.

So, based on that, I'd reckon "close-enough" languages would interfere destructively (ie use the same brain cells and they get tired or overheat or something). Since Spanish is by definition really close to Spanish, your friend would end up saying "no idea what this Notspanish language is, but I just can't learn it after learning Spanish".

More anecdata: after learning something new and difficult, I get a strong urge to sleep. If I do (quick 5-minute nap) I wake up refreshed and ready to learn more. If I don't take that nap, I'm just done, zero ability to concentrate further on the topic. Maybe that would also reset the 20-word daily limit. I'm not curious enough to test it, but I am a little curious.

Expand full comment

Related in some ways (probably just in parallel) is the Weber-Fechner law:

“Weber states that, "the minimum increase of stimulus which will produce a perceptible increase of sensation is proportional to the pre-existent stimulus," while Fechner's law is an inference from Weber's law (with additional assumptions) which states that the intensity of our sensation increases as the logarithm of an increase in energy rather than as rapidly as the increase”

https://en.m.wikipedia.org/wiki/Weber–Fechner_law
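
In symbols (my own rendering of the standard textbook form, not part of the quote above; k is a sense-specific constant and I_0 the threshold stimulus):

    \frac{\Delta I}{I} = \text{const} \quad \text{(Weber)}, \qquad S = k \ln\frac{I}{I_0} \quad \text{(Fechner)}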

Maybe, relative to one's previous level of skill, any improvement is likely to be less noticeable? Which doesn't entirely explain skills plateauing to the point where peak performance comes in the earlier years.

But perhaps an artist's "best work" peak is relative to the rate of increase prior to that point. Maybe they do keep improving in skill beyond it – whether at the same rate, faster, or slower – but regardless, the improvement is no longer special enough for people to consider the later work their "best".

Expand full comment

Back in the late 1970s, when baseball teams were paying a lot of money for famous free agents in their 30s, statistical analyst Bill James pointed out that ballplayers peak on average at the surprisingly early age 27 and decline rapidly in their early 30s.

For example, in late 1978 the Philadelphia Phillies made Pete Rose the highest paid player in baseball to play with them from age 38 through 41. James pointed out that fame tends to accumulate over the years (Pete Rose was extremely famous by 1978), but performance does not. Rose was a relatively late bloomer -- his peak year was 1973 at age 32, but even his will power couldn't resist time.

On the other hand, the Phillies won the World Series in 1980, even though the 39 year old Rose was pretty lousy by then. So whaddaya whaddaya?

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

I run a memory company called Save All and have many issues with this article:

------------------------------

> The curve above suggests you should remember things for longer each time you review them, eventually remembering them the whole rest of your life. But if this were true, doctors would gradually add to their stock of forever-knowledge and get better with time.

The problem is that they don't review everything... if they did, then your logic would be correct, but they don't. Every day they (1) partially forget the things they don't review and (2) remember the things they do review, so there can easily be an equilibrium where (1) > (2) and total knowledge is falling.

Also note the curve only suggests you remember things for longer each time you review them IF you review them at the RIGHT time (the red dotted lines)... If you review them far too late then this isn't true either.
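
A toy simulation of that equilibrium (parameters invented for illustration, not Save All's actual model): learn a fixed number of new items per day, review as many as you have capacity for, and lose a fixed fraction of whatever goes unreviewed. Total knowledge then converges to a plateau instead of growing without bound.

    # Illustrative only: 20 new items/day, capacity to review 100 items/day,
    # and 5% daily decay of anything left unreviewed.
    def knowledge_over_time(new=20.0, reviews=100.0, decay=0.05, days=2000):
        known = 0.0
        for _ in range(days):
            unreviewed = max(known - reviews, 0.0)
            known += new - decay * unreviewed
        return known

    print(round(knowledge_over_time()))  # settles near 500, where decay = intake
    # The plateau rises if you review more or forget less, but it persists.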

------------------------------

> Meanwhile, I still remember a wildly uneven set of facts from my high school history classes, even though I’ve rarely reviewed any of them since then. Something else must be going on.

Every time you remember something you ARE reviewing it! That is what spaced repetition apps like Save All and Anki do: they trigger you to recall the information from your memory, which is what a review is!

------------------------------

> This suggests an interference hypothesis: once there are too many similar things in memory, they all kind of blend together and it’s hard to learn new things in the same space.

This may be true to some extent in specific artificial laboratory settings, but in general it is completely incorrect. Generally, the more we know about an area, the easier it is to remember new information in that area. This is because our brain organises what we know into a "schema", and any new information that fits nicely into our schema is much easier to remember than something that doesn't fit nicely into it.

e.g. Imagine you are in a cafe listening to a man talking on the phone in Chinese (or any other language you don't know). If I asked you 30 minutes later to recall the Chinese words the man had said you would have no idea. The sounds did not fit into your schema so you forgot them almost instantly.

Imagine the same situation now but the man is talking in English. Maybe he was talking to his wife about what to cook for dinner tonight, or maybe he was talking with his friend about a film he watched etc. . You'll be far more likely to remember this 30 minutes later because the information fitted into your schema.

Expand full comment

In addition to 'fit into your schema', the other thing is that when listening to a language you recognise, you use a different neural pathway than the one you use for 'listening to sounds in general'. And the language-you-don't-know gets the 'just sounds' pathway. This is one reason why it is more important for new language learners to learn the set of necessary sounds in their new language rather than an infodump of vocabulary. See Stanislas Dehaene's 'Reading in the Brain' for a description of how this works neurologically.

I think this would make a great scene in a spy movie. 'You say you don't speak Russian, Comrade? But here in my MRI machine (which I happen to have borrowed from the future, but never mind) I can see that this is clearly a lie...'

Expand full comment

I don't understand why that would mean it's more important to learn the sounds than the vocabulary first. (I can think of arguments for doing so — I just don't see how *this* is a reason.)

Expand full comment

Like learning to read by first learning the letters of the alphabet?

Expand full comment

You need to learn what the letters are, yes, but if you don't also learn the (possibly multiple) sounds the letters can make, it won't work very well. If you are learning a language that uses the same symbol set as one you already know, you will use the old language's sound set with your new language. (Bingo, you now speak with an accent, and will have to work on improving that.) But if you are learning a completely new symbol set, somebody will have to tell you what the letters sound like. If you are learning a language which has two forms, one more phonetic and one more a 'memorise this bag of characters', like the distinction between kana and kanji in Japanese, you will do better if you learn the kana first.

Expand full comment

Language in your brain is sounds. (Unless you are deaf from birth.) We now know that when you are reading, you are hearing the sounds in your head of the way the word is pronounced. It's not word-to-concept or word-to-symbol or any of the other things we thought might be happening. It's sound processing before it is anything else -- see Reading in the Brain for the details. So it is easier to learn vocabulary when the words you are reading have the correct sounds bound to them in your brain. When they don't (because you don't know what the sounds are) you find it harder to recall them. This has been tested in language labs again and again, but now we are getting a good bit closer to understanding why.

Expand full comment

> when you are reading, you are hearing the sounds in your head of the way the word is pronounced

Nope, for me this isn't true. Except if I'm extremely tired, then yes, I mentally "spell out" the words I read. Otherwise, I see the letters and visualize the concept without "hearing" them in my head.

It's hard and awkward to describe one's own mental processes, especially to people who don't seem to operate in the same way. It's so easy to say "everyone has a mental monologue" or "you are always hearing the sounds in your head when reading" but it's just not true, and it's extremely frustrating when people assume everybody else thinks just like them.

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

I am sorry if I was unclear. This is not about 'hearing the sounds in your head as you read'. This is about 'come to my lab, sit in the MRI machine, and read something while we record what areas of your brain are getting activated. See that bit? That is your brain converting what you are reading into phonemes -- sounds.

Your brain is operating on language-as-sounds not language-as-ideas or language as visual symbols, whether or not you hear your own voice in your head as it is doing so'. And, aside from deaf people, everybody does do this, you cannot read without using these pathways in your brain.

Expand full comment

Sorry, that post happened too quickly. Neurologists are not, generally speaking, interested in the subjective experience of consciousness. It's the objective 'things are happening here, here, here and here in the brain and we can measure this' that they are interested in. So I wasn't talking about the subjective experience of reading, or assuming that anybody has one like mine, and I apologise for not making this clear.

Expand full comment

Thank you for the thoughtful reply. It is indeed much clearer now (as far as one can say that about such a complex topic).

I do wonder if, when you plugged me into the MRI, those areas would light up. More to the point, whether the experience of reading "in flow state" involves the audio-processing circuitry. Maybe it's somehow back-activated? Meaning, I read the word, am only aware of its meaning, but somewhere in the background the audio processor gets triggered too, just for the heck of it.

I also wonder: what happens with deaf people? Do they activate their motor cortexes, or maybe the gesture recognition circuits (if there even is such a thing)?

Ok, they're probably dumb questions, but this is fascinating :)

Expand full comment

"Every time you remember something you ARE reviewing it!"

Yes, but why do you remember some things but not others? It's clearly true that, the more you review something, the more likely you are to remember it. But for the subset of things you review equally infrequently, you will remember some of them but not others.

Expand full comment

Inorganic and organic chemistry: Did you really want to remember the details of either? I find it easier to learn things I'm interested in; perhaps you were more interested in knowing the basics of both than in the details of either.

Or perhaps you encountered references to the basics of both fields of chemistry when studying other fields (or even each other), which helped reinforce the memory.

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

I am bemused (after reading A Guide To Asking Robots To Design Stained Glass Windows) that you thought 'had a Moose' was the most salient fact about Tycho Brahe (besides being the founder of Astronomy). I would have thought that having an artificial nose made of silver to replace the one he lost fighting a duel at university, age 20, over a point of mathematics was the one thing everybody remembered. This may have some effect on why DALL-E can draw Herschel better than Brahe, or would rather draw Santa than Brahe, though. Tycho Brahe's long, very pointed moustaches (of which he was extremely vain) also seem not to have made an impression on DALL-E.

We have evidence that there is only so much vocabulary a person can learn at one time, and that it is better to drill vocabulary on your cell phone for 10 minutes several times a day than to take language classes every day for 40 minutes, or worse, every week for an entire evening. Traditional school classes were not designed with 'learning languages' in mind; they are a particularly bad fit for this, whereas other subjects tend to have a longer window before the student just isn't learning any more. We think this is because a teacher has a way to tie the new knowledge you are getting to the old knowledge you already had, and thus make more (and more interesting) connections. When starting to learn a new language, there isn't much there to work with. If you already have several languages, at least if they are related to each other, you have more connections to make to old knowledge -- aha, same root as in German -- which helps explain why people who speak multiple languages learn the next one quicker. But not always: knowing French and German makes learning Swedish go faster, but not so much learning Finnish or Mandarin.

It may also explain why some very good science students get to Organic Chemistry and then do poorly, often much worse than the less good science students. Their study habits do not revolve around 'memorising a lot of new facts', but rather around reasoning about new scientific facts they acquire and slotting them into a large corpus of their own scientific wisdom. These failing students are testably worse than the average student at memorising a set of new random items, and a lot of Organic Chemistry is presented as if the ability to memorise new facts, not connected to anything (yet -- we will get to that later), is effortless.

To combat that particular problem, we made up songs about Organic Chemistry nomenclature, and discovered that most students could learn a song, and remember nomenclature that way where they could not remember a list -- or the lyrics of the song not set to music. So apparently music, rhythm and rhyme matter when it comes to learning new things, and are good things to try if you need to pack things into your brain faster than your brain wants to retain it.

Expand full comment

I think there's false causation in the premise of the question

> creative artists, on average, do their best work in their late 30s.

Skill is NOT all you need to create your best work. Oversimplifying:

Produced work = skill x free energy to spend x interest in the area

Skill is the only thing in the equation that (barring some mental deterioration in old age) is strictly increasing.

Interest in the area is usually inverse-U shaped.

Free energy to spend peaks at a young age and then only decreases over time (both because there is less energy in total and because there are more things to spend it on: family, health, social connections, etc.). A toy sketch of how these shapes combine follows.
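
Here is that product made concrete (all three curve shapes are my own guesses, purely illustrative):

    import math

    def skill(age):       # rises quickly, then saturates
        return 1 - math.exp(-(age - 18) / 15)

    def energy(age):      # declines roughly linearly through adulthood
        return max(1 - (age - 18) / 60, 0)

    def interest(age):    # inverse-U, centered on the early 30s
        return math.exp(-((age - 32) / 15) ** 2)

    best = max(range(19, 80), key=lambda a: skill(a) * energy(a) * interest(a))
    print(best)  # the product peaks in the early-to-mid 30s, though skill keeps rising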

Expand full comment

There has to be a sweet spot where people are new enough to the field to still be enthusiastic, but also experienced enough not to make beginner mistakes. It wouldn't surprise me if that was generally in the early thirties for most professions, even if that wasn't the physical peak of many people.

> A natural objection is that maybe they’ve maxed out their writing ability; further practice won’t help. But this can’t be true; most 35 year old writers aren’t Shakespeare or Dickens, so higher tiers of ability must be possible.

If there is such a thing as natural talent, most people's "maximum ability" is probably far below the peak of human achievement. So just because they don't reach the level of the top performers in their field, doesn't mean they haven't reached the top level that is reachable *for them*.

Expand full comment

The actual reason for this is poor statistical methodology.

If you are doing something where you are either right or wrong, you can't get past perfect. As there is often some noise, this makes it look like you are seeing diminishing returns when in reality you are just too close to the top of the scale to see further improvement.

Any model where the top of the model is "perfect" will always appear to have declining returns over time.
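
A toy demonstration of that ceiling effect (numbers invented): suppose true skill rises steadily forever, but we measure it with a noisy test whose score is capped at 100%. The measured curve flattens out even though the underlying skill never stops improving.

    import random

    random.seed(0)
    # True skill rises 2 points/year with no ceiling; the test caps at 100.
    for years in range(0, 30, 5):
        true_skill = 80 + 2 * years
        trials = [random.gauss(true_skill, 10) for _ in range(10000)]
        measured = sum(min(t, 100) for t in trials) / len(trials)
        print(years, round(measured, 1))
    # Measured scores crawl toward 100 and plateau; the true skill never did.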

Looking at artists and writers, they often get better for many, many years; they do eventually stop getting better, but it seems like it caps out at different points.

Expand full comment

Malcolm Gladwell had a podcast a few years back on memory, and part of it included clips of people talking about 9/11 one year, then 10 years later. Amazingly, many people had different stories (often incorporating pieces of other people's stories into their own as time moved on). Yet, even when confronted with their previous stories, they were sure the new version was correct.

Expand full comment

I think one of the main factors about this is motivation and interest.

If you are new to a field, be it at school, in your profession, or in your hobby, you focus on it and it takes all your attention. Most people can't keep up the same focus for years, so if you look statistically rather than at the individual level, you will always see these plateaus, and they say nothing about individual abilities. It is fully explained by most people's lives going on, adding distractions or turning their main effort to other topics once they feel they have mastered their field, e.g. when people have established their career, they start a family, build a house...

Another factor could be that people get wiser, realizing that a career and being the best at any one specialty isn't fulfilling, so they lose a lot of ambition and focus on enjoying life.

Expand full comment

So I think something which cuts against this hypothesis is how much more efficiently and quickly language learners absorb a language in an “immersion” environment.

This, to me, indicates there’s really no cap at all. On the contrary it’s pretty much a more repetitions = better situation (perhaps with some diminishing returns).

Expand full comment

Or language acquisition is a special instinct distinct from most other learning?

Expand full comment

Yeah, this is totally on the table. Some more support for the "language is its own thing" hypothesis is how poorly IQ correlates with certain domains of language acquisition like listening/speaking (IIRC it does correlate with reading/writing).

Language acquisition is very strange.

Expand full comment

The flashcard part of the article concerns me a bit. Is there evidence to suggest that just doing flashcards is a good way of learning? I would think that some flashcards followed up by usage of both new and old words would be superior.

But I suspect that would take a lot more time daily, so maybe it's less efficient overall.

Expand full comment

Yes, this confuses a lot of people on the topic of English as a Second Language business: at public schools where all the little kids speak English on the playground, lots of English-language learners never can pass their ESL written tests even by age 14. But that's not because they haven't learned to speak English -- they speak English fine -- it's just that they can't pass any of their tests because they aren't very bright. But their below average intelligence didn't cause them much trouble at learning to speak English.

In general, IQ correlates pretty well with ability to learn a subject, but learning a language from other kids on the playground before puberty seems strikingly easy.

Expand full comment

In the case of creative artists, "skill" is something that they acquire young, at least in the technical sense. As writers like Shakespeare and Tolstoy got older, they "learnt" new things from research to produce their later works; for example, Tolstoy's researches on the 1812 campaign enabled him to write War and Peace, but that didn't mean he was a more skilful writer. Thomas Pynchon wrote his masterpiece Gravity's Rainbow when he was in his thirties. He "learnt" a lot about other topics later, to write other books, but that didn't make him a better writer either.

And unlike doctors, perhaps, great artists have a fundamental talent which they are born with: Mozart is the paradigm here, but Leibniz, devouring the contents of his father's library when scarcely out of nappies, is another. Such people are born with all the talent they will ever have. This is refined, as they grow older, and often produces (as with late Beethoven or late Shakespeare) an almost casual approach to art: experience has refined their talent to the point where they can do anything, so they please basically themselves.

Expand full comment

Fun fact: airliner pilots, no matter how much they train, can only be rated to fly two airplane models at any time. If they train on a new one, they must pick one of the existing two to lose the rating for.

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

I can see that being reasonable depending on how many flying hours per month you need to keep your rating in a model, and how difficult it is to regain an "expired" rating.

Also, how much union rules affect this.

Expand full comment

Interesting issue. Besides your points, one relevant factor for males might be serum testosterone.

Expand full comment

I thought a mechanism for "how much you can learn per day" was already known, in terms of sleep being the time when short-term memories are transferred to long-term storage. If you've only got so much short-term memory space, you can only transfer so much to long-term storage.

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

Minor point, but I feel like chiming in with my experience on memory techniques. The short version is that they "work" but aren't some kind of miracle, and the actual use-cases are pretty narrow.

After a few months of study I could memorize the order of a deck of cards in 10 minutes. This involved two things: Memorizing images related to cards (7 spades = Sock, 3 Hearts = Ham, etc.) and having a memory palace ready (https://artofmemory.com/blog/how-to-build-a-memory-palace). This doesn't really help with actually long-term learning for a few reasons:

1. The decay is pretty fast, although presumably spaced repetition would help with that. But then I'd need a bunch of memory palaces to not get them mixed up. And then we run into the limits you described. Competitive memory-people take a week-long break before competitions to clean out their memory palaces; aka let decay do its job.

2. Memory palaces specifically work by ordering (see the sketch after this list). I could list off all the cards in order, but would have a much harder time telling you the 37th card. Though this is probably fixable by doing a better job memorizing the memory palace with numbers attached, it still just associates number -> card. I don't know exactly what it takes to be a good doctor, but it sounds very different from simple lookups. The right reference material can always handle that.

3. The DOMINIC system (and similar) helps you remember the order of numbers, it's not a general purpose memory improver (or if it is the effect is very small). I could remember decks of cards because I invested time up-front in memorizing card-image associations. Higher up-front cost, lower marginal cost. Every technique I've seen has that pattern. Maybe something like this would help doctors remember their patients better[0] but only the ones with common symptoms and illnesses. And that's not any better than just keeping good written records.

[0] A common use-case for memory techniques is remembering peoples names and faces; good for teachers/professors and such.
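
For anyone curious what this looks like structurally, here is a rough sketch (the pegs and loci are invented for the example; real peg systems like DOMINIC are far larger). The palace is an ordered route, which is why reciting the whole sequence is easy but "what's the 37th card?" means walking the route from the start.

    # Hypothetical pegs and loci, much smaller than a real system.
    card_images = {"7S": "sock", "3H": "ham", "KD": "king"}    # card -> peg image
    palace = ["front door", "hallway mirror", "kitchen table"] # fixed route

    deck = ["3H", "KD", "7S"]                                  # shuffled order
    stored = list(zip(palace, (card_images[c] for c in deck)))

    # Sequential recall: just walk the route in order.
    for locus, image in stored:
        print(locus, "->", image)

    # Positional recall: the index exists only implicitly in the ordering,
    # so a human must count loci from the start to answer "card number n".
    n = 2
    print(stored[n - 1])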

Expand full comment

I once realized that I often had the problem of wondering which date of the month it was, even if I had tried consciously noting it in the morning. So I made a habit of linking the picture from my "major system" list to a picture linked to the weekday. E.g., last Monday it was a certain kind of bird that stands for 15, doing something with a screwdriver (because manual work stands for Monday). Having linked a specific day to two things, and the things to each other for that day, it reaches my working memory and I can retain it during the day.

Also, when I want to remember a list of several things, e.g. the names of the Italian provinces, just remembering the list is hard, but adding more details can make it easier, so that the items take on some meaning. Finding the right amount of detail is not easy, and memory techniques make it easier to just use the same additional details every time; then you already have an order for them.

So yes, it is about up-front investment, but not ONLY about order. It is also about detail and conscious remembering.

Expand full comment

Additional confounder: subcultures (for creative stuff) and paradigms (for science)

Young creatives might join a radical subculture they deem to be the next big thing. Well-established creatives in their 50s are not incentivized to alienate their audience and try their hand at some fad which might well fizzle out. Scientists in their 50s may produce good normal science, but are unlikely to break away from the current paradigm, should the opportunity arise.

Also, most people probably don't have "maxing out my $FOO skill" directly in their utility function. Perhaps writers are more interested in telling their stories to audiences than in achieving literal perfection. Using a skill might not be the most effective way to level it. I have a driving license and am a non-exceptional driver. I certainly leveled my driving skill during the first year of owning a car. If I really was interested in improving my driving, I could take more lessons or join a race track or whatever, but the utility of another level of Pilot Ground Craft is just not worth it to me.

--

I am somewhat skeptical about the claims of physicians' skills leveling off after one year. Take surgery. An MD starting out will have about zero experience with it. I doubt that after a year they will be a world-class brain surgeon. For obvious reasons, it is impossible to do RCTs on surgery. One might test the diagnostic capabilities of MDs by having actors describe symptoms, or have physicians take tests about clinical cases.

--

With regard to memories of hearing about the 2001-09-11 attacks, I think memory is reinforced (or even overwritten) by remembering. While I do not doubt that such events form strong memories, I think I mostly remember what I remembered when thinking back for the first time. In 2002, I did not find it important to remember what TV channel I saw the twin towers on, so now I can not remember that.

Expand full comment

I wonder if there's also an issue of background environment change. We live in a particularly fast-changing environment relative to our ancestors. I wonder if people getting "set in their ways" prevents them from adapting to new environments, which is what they'd need to do to improve.

Expand full comment

A somewhat related anecdotal report:

As a kid, I picked up chess playing as a fun thing to do. It was mostly fun because after some practice I started beating the adults in the area.

My skill level froze right there when nobody would play me anymore. Not enough practice with challenges to my skill level caused a plateau.

As I lost interest in the game over time, I eventually started running into people who could trounce me once I got older, but playing only a few times every several years also prevents advancement.

Somewhat off topic: analysis indicates my play style messes with other humans and causes them to err, as I've rarely beaten machines even when I was still practicing.

Expand full comment

I recommend the book Moonwalking With Einstein. It's about a journalist who was challenged to win the US Memory Championship, starting from basically no talent/training. I won't spoil the book, but I will say by the end he was able to memorize a randomly shuffled deck of cards in under a minute, which is insane.

He also interviewed the inspiration for the movie Rain Man in the SLC public library, who reads phone books and memorizes them practically instantly.

Later he interviewed a man whose traumatic brain injury left him unable to convert short-term memory to long-term memory. In one of the book's more humorous moments, he asked this man he'd just met whether he'd like to go for a walk. Nope. Five minutes later ... Care to go for a walk? Sounds great.

The crazy thing is that the guy knows his way around the neighborhood, even though he and his wife moved there after his injury. He's not even aware he knows his way around.

I think one aspect of the memory model needs to account for different kinds of memory/learning (e.g. location/spatial learning, traumatic experiences, scents, etc.).

I think there's also a cataloging effect. There's the well-known study of chess masters who were shown chess positions and demonstrated superior recall commensurate with experience. This effect disappeared when positions were truly randomized (including positions impossible in a real game), suggesting the chess masters are structuring their learning and recall.

Expand full comment

What's up with that forgetting curve graph? Why is "60 minutes" a separate point from "1 hour"?

Come to think of it, that's the sharpest dropoff I've ever seen on one of those, suggesting the need for 3-4 reviews on the first day alone...

Expand full comment

You proposed decay caused by people forgetting what they used to know, but there's also decay from a field itself changing. This is sometimes called the half-life of knowledge: https://en.wikipedia.org/wiki/Half-life_of_knowledge.

I experience this as a software engineer. All of the tools I use and the products I work on are continually being updated at different rates. There are areas that change less quickly, where it's possible to acquire deep knowledge over decades. But many of the things that I have to keep in mind on an average day are relatively ephemeral.

I believe many other types of knowledge work have a similar problem. The state of the art in the field changes rapidly, so you have to spend long hours studying and learning new material just to stay at the same level of expertise. Perhaps that's why we say that doctors "practice" medicine, or that lawyers "practice" law.

Expand full comment

A good theory is about efficiency: after 10 years, older professionals don't get better at the core material of their field, but they do it better, with fewer resources, and are far more efficient (in the doing, not in overall results). After 13 years as a lawyer, I don't generally get better (perhaps 1% a year), because the theory and practice change all the time (the law itself, jurisprudence, dogmatics, etc.), but I get better at doing it: it takes me less time to produce the same amount of work to the same standard. So it varies within fields of knowledge, practice, and efficiency, but not in its merit.

Expand full comment

As weak evidence for The Interference Hypothesis, I can attest that after 90 minutes of Japanese Anki practice I would be sick and tired of Japanese and unable to retain any new words, but I could then switch to computer science or reading papers without much issue.

It's actually a 'life hack' of mine to be more productive: procrastinate one subject by studying another as a form of rest.

Expand full comment

Interference is a leading theory for where working memory limitations come from. I tweeted about a (tough but interesting) paper comparing leading theories some time ago: https://twitter.com/mpershan/status/1219787760078331907

Expand full comment

Remembering can cause forgetting.

An interesting implication of interference is that successfully recalling one thing will induce forgetting of similar (interfering) things. https://www.researchgate.net/publication/15268332_Remembering_Can_Cause_Forgetting_Retrieval_Dynamics_in_Long-Term_Memory

Expand full comment

The human body modulates its capabilities depending on use; why would memory be any different?

The base motivations are likely the same: humans don't constantly keep optimally fit bodies because it costs a lot of energy, and a high base energy burn means faster starvation in times of dearth. Memory also costs energy, and "wasted" memories just clutter up the brain and increase systemic energy use.

Another dynamic is reinforcement: something that is used over and over again is not forgotten.

And another dynamic is "strength" of memory: 9/11 was a powerful emotional event, and it seems intuitive that powerful memories would result. Ditto the hatred of that specific Hebrew word: it carried far greater emotional strength than the passivity the rest of the dictionary evoked.

Combine the above, physical and mental: maybe that's why jocks tend to be dumb. How much mental reinforcement can you have if you're spending 3-5 hours a day exercising? It might just be a function of optimizing the physical at the expense of the mental... lol

Expand full comment

I think the interestingness factor can just be explained by spaced repetition. You reflect back on 9/11 frequently which locks in the memory of where you were/what you were doing at the time. You have no reason to look back on any of the other dates surrounding 9/11. I think this explains retention of most memories. If you often look back on them, they will stick.
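For what it's worth, this is easy to state as a toy model (my simplification, not anything from the post): in the simplest exponential forgetting curve, retention after time t is

    R(t) = e^{-t/S}

where S is the memory's stability. If each successful recall multiplies stability, say S_{n+1} = \alpha S_n with \alpha > 1, then every anniversary retrospective of 9/11 flattens its curve a little more, while 9/12 never gets a recall event and decays untouched.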

Expand full comment

So similar to the mnemonic device is the memory palace, where you create a large, extremely detailed "physical" space in your imagination and then associate various pieces of information with each detail in the space. The idea is to tap into our preternatural ability to remember WHERE things are, especially in relation to other things, as a hack to remembering whatever we need to remember.

One cool aspect of this technique is that you can reuse the same "place" to remember different pieces of information. For example, if you associate, say, the date of the attack on Pearl Harbor with a particular tile on the floor in the hallway on the way from your imaginary sunroom to the gameroom, you could also associate some other piece of information, like the German translation of the English word "refrigerator", with that SAME tile.

I may not remember WHEN certain changes happened, but I can remember exactly what my room looked like in my childhood. Even when the furniture was rearranged, my toys were completely replaced by a different set more appropriate for my age, and when I finally got an old (like knobs instead of buttons old) tv to play video games on, the configurations of everything and where everything was, are still as readily recallable as if I still lived there.

Couldn't tell you exactly which books I had, but I could tell you the main types and where they were, what they were next to, what color the lamp was, where it was situated, where the bed was, the night stand, where my toys were stored, my dresser and which clothes were put where, which print my bedspread currently had, and how many blades my ceiling fan had. I can remember all of this even though it's been over 30 years since I lived in some of those places.

If I were to associate some piece of information with each of these innocuous details, chances are I'd probably remember most if not all of them.

I bring this up because I'm now curious to know what the literature says about SPATIAL memory. How quickly do we forget where things are? What about in relation to other things? Does the amount of intent in placement factor into how well we remember where we put things? I know there's a book written by a guy who studied a bunch of mnemonic devices on a whim to see if they worked and ended up winning the world memory competition that year. Are there other types of mnemonics we haven't tried yet?
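Structurally, a memory palace is just a fixed, ordered route of loci with arbitrary facts attached to each stop, and recall is a walk along the route. A minimal sketch in Python (the loci and facts are invented, echoing the examples above):

    # A memory palace as an ordered route of loci. One locus can hold several
    # unrelated facts, as described above (reusing the same hallway tile).
    palace = ["sunroom door", "hallway tile", "gameroom lamp"]  # fixed route order
    associations: dict[str, list[str]] = {locus: [] for locus in palace}

    def place(locus: str, fact: str) -> None:
        associations[locus].append(fact)

    place("hallway tile", "Pearl Harbor attack: December 7, 1941")
    place("hallway tile", "German for 'refrigerator': Kühlschrank")

    # Recall = mentally walking the route in order.
    for locus in palace:
        for fact in associations[locus]:
            print(f"{locus} -> {fact}")

The route supplies the order and the locus supplies the retrieval cue, which is why one tile can carry several unrelated facts.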

Expand full comment

Re the Ritalin study: are people using stimulants like Ritalin and Adderall to learn? Is that their intended purpose? I find that mildly insane. I thought Ritalin and Adderall were for doing work: increased focus and speed is great for finishing that essay, making progress on your software project, finishing unpleasant tasks like cleaning, etc. But I truly did not know that people were using those drugs primarily to learn new information. If your brain won't attach to the information to be learned without stimulants, then they could allow a student to focus *at all*, but I'm clearly missing something with the hypothesis that stimulants would improve storage and recall in users.

Expand full comment

Yeah. Why can't they just drink coffee?

I think using these drugs is very dangerous in the long run, while I have no problem drinking coffee. Old-timers' bias, probably.

Expand full comment

Haha, it does sound mostly like old-timers' bias to me! Coffee is nowhere near as effective a stimulant. I just want to know: if people did expect stimulants to improve memory/recall, where did that expectation come from?

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

“A mathematician is a machine for turning coffee into theorems.” Attributed to Erdős (an avid coffee drinker), but apparently it was Alfréd Rényi. See also https://hub.jhu.edu/2014/01/12/caffeine-enhances-memory/

Maybe the theory was/is that if coffee is good for you, then someone mixing up a batch of medicine has to be better. Speed was used in the '70s and '80s to ostensibly pull all-nighters: ridiculous, but I remember people trying it.

But there has been research showing that amphetamine use can actually lead to memory loss. See https://las.illinois.edu/news/2009-11-01/amphetamines-and-memory-loss#:~:text=Drug%20abuse%20in%20adolescence%20may%20impair%20adult%20working%20memory.&text=A%20recent%20study%20by%20U,ve%20stopped%20taking%20the%20stimulant.

I think there is a silent epidemic of college kids getting ADHD medications for "study enhancement" from unscrupulous doctors when they don't actually have a diagnosis!

The history of nootropics and stimulant use for ostensible mental enhancement (and the obvious dangers) would probably make an interesting topic. I've heard the Nazis were speed freaks. Probably a lot of dangerous snake oil.

Expand full comment

So what's Hebrew for master of ceremonies?

Expand full comment

מנחה

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

I have recently been thinking that between 35 and 55, people are at their peak douchebagginess. They know stuff and have developed skills, but they also massively overestimate their prowess: the Dunning-Kruger effect. Related to this article, I'd propose that because they overestimate their abilities, they coast on improvement activities. Unapologetically, a boomer.

Time is also fixed, so it can get filled up with things that are unrelated to improvement of "skills": parenting, household and administrative stuff.

My grandfather used to say, "We aren't getting any older; the kids are just growing up." The reality is we are getting older. While I'd take a more experienced surgeon over a less experienced one, there is a point past which I would not want an older surgeon operating. (It's good that we limit airline pilots to under 65.)

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

Dunning-Kruger's been called into question, but if you're using that, people 35-55 should be at minimum overestimation; it would be the younger people and older people who know less and therefore think they know more. You're not necessarily wrong, but you need a second term for that equation. ;)

Expand full comment

It is curious. They should be at minimum overestimation, yet they aren't. Hence, maximum douchebagginess. But I could definitely be wrong, as I am getting old.

Expand full comment

> Economist Philip Frances finds that creative artists, on average, do their best work in their late 30s. Isn’t this strange?

I'm not sure it is?

"One-hit wonders" and people who fizzle out and quit early surely are a large proportion, compared with people with careers that increase in popularity.

We already know we don't live in a world where all the chart-toppers are produced by the elderly.

And don't averages tend to land near the middle of a range? Late 30s is right there mid-career.

Expand full comment

The most obvious missing alternative is what I'll call the Experience Hypothesis. One starts out without experience. There's training, and what I'll call guided practice: doing, but with a teacher or guide to fall back on. Lack of experience is part of the learning curve, but what happens when one acquires experience, particularly successful experience? For one thing, the job becomes easier, even the job of creating. Why is it easier? It is easier because it has been done before. Some patient already had that problem and such and such helped them. Some visual idea was similar to some other idea and could be expressed with similar metaphors. Some situation presented a challenge, but elements of that challenge were soluble.

The way one works and thinks changes. There's a shift as experience accumulates. It is easier to solve problems, address new challenges and produce at a high level because it is not the first time. The physicist Richard Feynman said that his genius consisted of figuring out a few techniques and applying them in novel situations. This is true in just about every field and is generally accepted, but it has a down side.

The problem with experience is that it gives one a sense of what to expect and what to do about it. If you are into machine learning, it's like setting Bayesian prior probabilities. This is a limitation in human learning as well as machine learning: so much depends on the training set and the order of presentation. Experience is path dependent, and that path limits where one can go in the future without a major breakthrough.

Expand full comment

Here are my humble thoughts on this topic.

We tend to remember things that are associated with strong emotions. That's why Scott remembers the word for Master of Ceremonies, because it made him angry. One thing that evokes strong emotions is novelty. See, for example, Ekman's model in which surprise is one of the basic emotions (wow, Ekman is still alive!). So if you are not running across novelty, I would guess you are less likely to remember things. Another thing that is important is motivation, which I guess you could associate with the enjoyment emotion.

Novelty and motivation are two things that decrease with age in the same profession. Let's add two more factors that play into this. The first is resistance to change as people get older. Most people become set in their ways and settle into routines. Novelty becomes stressful and is avoided if possible. The other thing is crossing the threshold of the midlife crisis. This is the realization that you are in the second half of your life and you can no longer make plans into infinity. Goal-oriented behavior with long term goals becomes less important and there is a shift in focus from telic to atelic activities. Unfortunately, for many people, this means less interest in learning new things.

Expand full comment

I think it's mostly decline in fluid intelligence, energy, and motivation, which offsets the increase you get in crystallized knowledge.

If you look at someone like Warren Buffett, who is working in a field that is mostly about crystallized knowledge (and psychological discipline), he's had most of his success since he was 50, and some of his best investments in his 80s (he's up, I think, $100 billion+ on Apple, an investment he made just a few years ago). He of course did very well when he was younger too, but he didn't stand out from his peers quite as much, and he did it with a much different style of investing that wasn't possible later on. Most of his peers retired at a certain point to enjoy their wealth, but he seems to be obsessed with the game.

He gets on stage at the annual meeting, and seeing him share his encyclopedic knowledge of businesses accrued over the decades is like watching the GeoGuessr kid. You can see how after 70 years it's probably enough to offset any slowing elsewhere in his skillset.

I think other fields depend more on fluid intelligence and energy and so on. You can see that most of the really young entrepreneurs who hit it big do it in a really new field where they don't face entrenched, experienced competition (think Gates or Zuck). A lot of fields favor older entrepreneurs: Sam Walton didn't start Wal-Mart until his 40s, after getting a couple of decades of experience in retail first.

I'm skeptical of the decay hypothesis. Vocabulary pretty much increases throughout your entire life, for example. I think there are a lot of explanations for peaking in your 40s that are more plausible.

Expand full comment

I think most creatives who reach a certain level of success recognize the improvement in their work due to access to new resources and the support of the audience. These are transient. As they diminish, the creator experiences diminishing returns on their work, and their enthusiasm, if nothing else, is likely to diminish as well.

Expand full comment

I suspect you are conflating two different phenomena. One you discuss with your physician examples.

The other applies to a writer, poet, or scientist. He brings a new approach to his field and exploits that new approach to produce something new and different. Having done so, his approach is now part of the field, so further use of it will not feel new.

I remember my father suggesting, along those lines, that some very productive scientist in about his late thirties should retrain in an entirely different field to see if the result would be a new burst of creativity.

Expand full comment

That "scalloping" pattern you showed in the first figure is common in perceptual and motor learning as well -- but what isn't shown (because it wasn't your topic) is that in these forms of learning that tuck things away into automation every night as we sleep there is also progressive improvement. So while fact learning, language learning, and episodic memory have the limits you described -- process learning can improve pretty much until there is no more room for improvement (perfect walking, for instance).

Another comment -- thanks for the she/he generic person gender switching!

Expand full comment

It seems clinical cognition may begin to decline after the age of 35, but then strange patterns begin to emerge. When you get over 65 or 70, new strains of cognition seem to show up.

Conversations from 46 years ago finally make sense, and one frets because the remembered response seems inadequate. It's like a joke it took you decades to get.

It's similar to driving. When you're young, you're quicker, more adept. When you're old, you take the back streets and wander a bit, and avoid situations where you need quick reflexes. It's great for researching archives, though, because you search differently -- almost with new eyes -- and find things you overlooked decades ago.

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

As seen on the Internet today.

"My mind is like an open browser with fifty tabs open. A few tabs have timed out—a couple are displaying 404's not found—and I have no idea where the music is coming from!"

Expand full comment

Like some other commenters, I would look to satisficing, distraction, and risk aversion as the main causes of plateauing, rather than intelligence and skills specifically. Unlike perhaps theoretical physics, most fields of work don't require peak level fluid intelligence. Doctors are probably happy enough with their skill level and distracted by the craziness of the healthcare system as well as everything else. Professors and artists are probably distracted and, I would imagine, scared of looking bad by being too creative. It's easier to let yourself be typecast than to continue to explore new areas and open yourself to criticism.

It would be fascinating to see better data sets on performance versus aging. I would be curious about investing, management (e.g., CEOs), and writing.

Expand full comment

These explanations feel dissatisfying to me -- isn't skill in many of these domains (e.g. creative artists) about much more than memorizing things?

Expand full comment

I don't have anything directly relevant to say, but I encourage people to read Atul Gawande. I particularly like his essay on the Shouldice Hernia Center, which takes one year to produce the best hernia surgeons in the world.

https://utmedhumanities.wordpress.com/2014/10/14/the-computer-and-the-hernia-factory/

Expand full comment

It was pretty interesting, but it feels like a stretch to put hernia surgeons and computers in the same bucket. He's equating specialization with automation. While there are similarities between the two, in that essay it feels like he just hopes you don't notice the distinction.

Expand full comment

Yes, when I read the essay I filed it in my mind as two essays on different topics, both interesting and important, but quite separate. I don't remember what he says to link them and I may well disagree with it.

Expand full comment
Aug 18, 2022·edited Aug 18, 2022

I think there are at least a few examples of people *not* having a mid-career peak who instead just keep improving (if slowly) until death or a debilitating condition takes them out of the proverbial game entirely.

For example, in my subjective opinion, Terry Pratchett's books tended to get better, not worse, as he became older and more experienced - even those he wrote after he started to suffer from Alzheimer's disease were at peak quality. (He had an atypical disease progression; the brain functions most relevant to speech and creative writing were among the last to decay.)

Another example: Leonhard Euler continued to produce novel mathematical insights well into old age.

Expand full comment

On the contrary, I thought Pratchett's books were just getting worse. When I heard he had Alzheimer's, that explained it. Unseen Academicals and on are terrible.

Expand full comment

As someone in his 40s...fuck.

Expand full comment

I think authors and doctors are both special cases.

Authors have a culture and an industry which discourage any improvement after first publication. Before they "break in", they workshop, read books, attend critique groups; they know they're not good enough yet. After they break in, that motivation vanishes, and most authors don't seem to keep experimenting or getting better.

I've done stylometric comparisons of fan-fiction authors and famous professional authors, and the most obvious difference between them is that many of the best fan-fiction authors experiment a *lot*, writing in different genres, with different rhythms, styles, points of view, tenses, everything. Their different stories are all over the place in any projection of a high-dimensional stylometric analysis (which typically defines each story-datapoint as the vector giving the frequencies with which every common English word is used in that story). Professional authors, by contrast, hardly ever experiment once they've been published. All their stories cluster together in a small spot in the projection, and they usually use the same point of view and tense for all their works. I remember finding only a single case where one pro author's works overlapped with another's -- Truman Capote and Harper Lee. Turns out they grew up as next-door neighbors and worked on their books together.
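For concreteness, a generic version of that kind of analysis looks something like the sketch below. This is the standard word-frequency-vector technique, not necessarily my exact pipeline, and the texts are placeholders.

    # Each story becomes a vector of common-word relative frequencies,
    # then gets projected to 2D: experimenting authors scatter widely,
    # one-style authors cluster in a small spot.
    from sklearn.decomposition import PCA
    from sklearn.feature_extraction.text import CountVectorizer

    stories = [
        "the cat sat on the mat and the cat slept",
        "a dog ran in the park while a bird sang",
        "the bird flew over the tree near the park",
    ]  # placeholders for full story texts

    vectorizer = CountVectorizer(max_features=100)  # most common words only
    counts = vectorizer.fit_transform(stories).toarray().astype(float)
    freqs = counts / counts.sum(axis=1, keepdims=True)  # relative frequencies

    coords = PCA(n_components=2).fit_transform(freqs)
    print(coords)  # one 2D point per story; plot them colored by author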

Most authors and editors advise younger authors never to read their reviews. Nor do they get much feedback from their customers; print publishers don't do a single thing to make it easy for readers even to contact writers. Nowadays there is social media, but that's recent enough not to be observable yet in a data set of authors.

Doctors learn a lot in medical school; not so much after medical school. Medicine is on the edge between changing slowly enough that you can learn on the job, and changing so quickly that everyone in the industry makes continual efforts to stay current. Some doctors try, but Scott wrote that most rely on pretty women handing them free pens to learn about new drugs. That matches my experience with my doctors; I often have to inform them of newer drugs, tests, or procedures. (They hate that. A lot.)

Expand full comment

Yeah, I'm much more in the "we stop teaching" camp. When you get to 22 or 26 or 29 or whatever, it's time to start doing stuff: creating value for yourself and others. Learning is hard, and most people can't really do learning and working at the same time.

I know Scott has an anti-school position, but I really think that "people learn less when they're not in school" is a total non-mystery.

Expand full comment

When people get good enough to do their jobs, they stop trying to improve and instead start profiting off their production.

Doctors exist (in the USA, at least) in a context of artificial scarcity. There’s so much work for them to do that they needn’t push their abilities to the limit to stay competitive.

Children with ADHD need to study hard enough to earn the approval of adults, and to perform well enough on tests to get by. They don’t actually need to improve their learning to achieve their goals, so they don’t.

Writers who can get a novel published might even actively avoid trying to get “better.” They’ve found a voice that works. Why mess with it?

I wouldn’t expect this embrace of a plateau to be a conscious choice in most cases. It just happens as people lean into the easy pleasure of enjoying their competency and reaping the rewards.

My guess is that if you studied only populations who were incentivized or driven to continuously improve over the long run, you would indeed see continuous improvement.

Expand full comment

"Finally, I conflated two things in the previous section: a limit on how much you can learn total (eg the doctor who practices for many years) and a limit on how much you can learn per day (eg twenty words of Spanish vocabulary a day). I have no evidence for the latter except the testimony of one acquaintance, and maybe the corroborating evidence from the Ritalin study. Still, if there’s a maximum amount you can learn per day (or, more likely, a diminishing returns curve) that sounds useful to know, doesn’t it? Psych undergrads asking me for study ideas, here’s your chance!"

I'd suggest using Toki Pona as the study material, as most participants will not know it already, and because the entire language only has 140 words.

https://devurandom.xyz/tokipona/

E.g. this course neatly packages it into 14 lessons with ten words each. Some of the words serve a grammatical function, though the grammar is pretty simple. Use Chinese-speaking undergrads and the grammar should not be an extra issue, as it'd be familiar to them.

Personally, I struggled to learn much more than 20 words myself. But I was also at a family gathering and this was more of a fun distraction.

Expand full comment
Aug 19, 2022·edited Aug 19, 2022

Other possible contributor: Specialists becoming Generalists

* a side effect of learning interference

* the contribution of generalists is harder to quantify (generally)

Even in the rare cases where stamina, ambition, motivation, drive, passion, and skills are held constant, a high performer will not compare well with their former self when becoming a manager, coach, teacher, mentor... or parent.

Like a sleek athletic feline becoming a cat herder.

Expand full comment

At least for me it is wildly easier to memorize a 9 or 12 or 15 digit series of numbers than a sentence of that length. Especially if the sentence has little content.

Expand full comment
Aug 19, 2022·edited Aug 19, 2022

I think there are a few more psychological phenomena worth considering:

1. Part of what makes you remember “where you were on 9/11” is that you’ve been reminded of this moment so many times. Thinking back to it causes you to recall it and store it anew; it’s akin to a repetition. The same has not happened for 9/12/2001, at least not in my life.

2. Another part of remembering 9/11 is the emotional charge, which amplifies rememberability. Maybe it didn’t have that charge originally (I remember wondering when I first heard “huh. I wonder how often that happens.”) but if you’re an American it has very likely developed a strong charge over time. I suspect this is what happened with “master of ceremonies” - you’re cursed to never forget that grievance until one day you meet the author and have your Inigo Montoya moment.

3. Our minds are not arbitrary data processors; they’re tuned to particular types of things. Learn 20 Spanish words, and you’re exercising only a very small subset of sounds we can distinguish. Learn 20 Mandarin words and you’re working with a significantly different set of sounds. (There’s probably also in-built machinery for code switching; it would be interesting to know if you could learn 20 polite words and 20 slang words, for example.)

But even bigger - you ask if remembering two six-digit numbers is as easy as remembering two interactions between people, and I think the answer is a resounding no. We have a lot of built-in machinery to deal with recognizing individuals, and much of our lives are built around fine-grained distinctions around actions. Our minds are built for remembering Chuck Norris shot a rocket at Marie Curie; they are not built for arbitrary sequences of numbers.

The most striking example I’ve encountered are the people who can memorize the order of a deck of cards after one go through. My understanding is that they’ve associated each card with a person or thing, and as they go through the deck they imagine going through a house and encountering each corresponding person or object in turn. This effectively stores arbitrary information (order of cards) in spatial memory, which seems to be much stronger.

Does this work? Well, if they can remember the order of the deck of cards and they couldn’t before, I’m inclined to say yes! But I want to be clear that I have not done even a semi-responsible amount of research.

Expand full comment

Could be a simple case of incentives. When you are young and hungry, you have greater motivation to work hard, prove yourself, and gain the respect of peers. When you are older, your priorities change, and for many the job turns from a passion into a paycheck. The reversion-to-the-mean forces are likely very strong here: the more success you achieve early in life, the more difficult it is to keep showing the same level of success and to maintain the same level of motivation. In short, natural ability may not be changing over time, only the priorities and incentives to keep pushing the limits.

Expand full comment

As someone diagnosed with ADHD, I’ll say that Ritalin doesn’t help one learn things, but it does help one do things.

That is, it’s useless for studying, but great for applying knowledge to a task for longer.

Take it for the test, not the lesson.

Expand full comment

We remember what we think about.

80% of doctors don't go out and learn new stuff. Neither do 80% of anyone else. The first year of learning hits the 80/20 rule: you learn the 20% that accounts for 80% of cases/experiences/symptoms. After that you hit some of what wasn't accounted for, but you never see those often. So by year 2, you've seen all of the cases you'll ever see twice. Everything else is a one-off, and you cure those with whatever the a priori probability is that they cure themselves.

Expand full comment

Skills also plateau when you stop finding new problems to apply them to. Some math problems call back problem-solving strategies and factoids you haven't used in a long time. Novel problems and novel opportunities to extend one's current knowledge base in new directions create the possibility of growth. In the learning-ontology literature this is called a learning frontier within a learning space.

A learning space is a continuous network of items which can be learned without any skill or knowledge jumps between them.

If knowledge and skills were continuous, then something like deliberate practice plus a map of learning spaces and skill trees could always make for a better doctor or a better writer. But if knowledge sets and skills are not continuous, and instead require leaps of insight or sudden changes in methodology for improvement, then the persistent writer can plateau not only because they forget, but because their practice yields no insight into the underlying structure of language. Sometimes without the right frame there is no progress.

Expand full comment

"I will never forget where I was when I heard about 9-11, but I very much forget where I was on 9-12, 9-13, etc. The decay hypothesis doesn’t explain this."

Doesn't the decay hypothesis (partly) explain this? You thought about 9-11 much more frequently than 9-12 in the ensuing days, months, and years, so it was effectively spaced repetition learning.

Expand full comment

So by using Google Search heavily instead of memorizing the details, I can save my brain from interference?

Expand full comment

You don’t need the box brackets in Bayes’ equation, even without precedence rules, or have I missed something? Top blog regardless
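Assuming the post uses the standard form, it does parse unambiguously without extra bracketing:

    P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}

since the conditioning bars bind more tightly than anything else in the expression.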

Expand full comment

The article focuses on memory retention and the ability to answer exam-type questions as a proxy for skills, but are skills the same as being able to answer exam questions? And are there other plausible explanations for the observed data even if they are?

Firstly, the population of any profession might change quite a bit as it ages, and that change might influence the findings shown:

1. Women might leave the profession (for example, to have children) more than men. The lower average scores of 60-year-olds might be because men are slightly worse than women.

2. The best people might be promoted, leaving those still practicing worse than their promoted colleagues. In my field (software), many great engineers become product managers or engineering leaders - are those left behind slightly worse on average?

Secondly, the exam questions might be a very poor proxy for skill. For example, I am fairly successful in business. I'm confident that I would do much worse on an MBA examination than someone who has just studied the course. Does that make me worse at business, or worse at solving the contrived simple examples that academics think represent running a business?

Finally, knowledge might go stale (particularly if it is irrelevant to practice in the actual field). Professionals tend to specialise. A qualified doctor specialising in lung problems might struggle to remember the examples found in exams about kidney disease. That doesn't make them a worse doctor, just not current in non-relevant parts of their field.

Standardised tests also might alter over time as the knowledge in an industry evolves; medicine must be particularly hard to keep up in. It wouldn't surprise me if older doctors appear worse partly because the received wisdom keeps evolving, particularly in areas that are no longer important to them.

My guess is that the article's conclusions are probably correct, but I don't feel that the data and arguments used to support them stand up.

Expand full comment

Excellent article! As an economist by training, I’m reminded of the law of diminishing returns. As more investment goes into an effort, the gain on that additional investment is proportionally smaller.

Expand full comment

Thoughts generated by thinking too much about the language case:

- ok but actually learning two different but similar languages at once (e.g. Spanish and Portuguese) causes HELLA interference (plausibly more than just learning more words of a single language, because you have to track the extra bit "which language does this word belong to", as well as making many small distinctions between similar words)

- on the other hand there's also a countervailing mutually reinforcing effect - learning Spanish makes me better at Portuguese in some ways (e.g. I can kind of read Portuguese text without specifically knowing most of the words in it because I've seen very similar words in Spanish before)

- sometimes there are both effects at once (e.g. if I learn a word in Spanish, I'm more likely to later understand a very similar word in Portuguese, but it might also make it harder for me to remember what exactly the Portuguese word looks like, or even to remember that the word actually does exist in Portuguese)

- wait, doesn't this mutually reinforcing effect also exist with nonlinguistic knowledge? e.g. if I have a fuller model of how human bodies work in general, it'll be easier to learn an additional fact about how human bodies work, compared to if I have a very sparse model, because I have more things to connect it to.

- so I think probably we also need an explanation of when related bits of knowledge interfere with each other and when they reinforce each other (and when they do both?)

Expand full comment

For creative fields, I think there's an additional helpful factor. For many authors, their first work is some mix of an interesting idea executed in an amateurish manner. As the author grows in skill, their perception of their own skill grows, and they start thinking of writing their magnum opus - whatever idea they've had in the back of their mind, the story they would write when they got good enough. So they write it after achieving *just enough* skill, which leaves plenty of years afterwards for writing good stories that are not their one big idea.

The information I'm leaning on in this comment comes mostly from the livestreams of prolific fantasy author Brandon Sanderson, as he's talked about his own process - and others' - in this respect.

Expand full comment

Hah, interesting. I just wrote this, so I was able to reference this piece: https://sagardubey.substack.com/p/common-concerns-of-wannabe-polyglots

Expand full comment