Comment deleted

Please clarify what you mean by less agreeable (grumpy?) and why you think it would correlate with intelligence.


Less agreeable does not mean grumpy, it means literally less likely to agree with others, i.e. go along to get along, concede on matters, etc. One can be quite cheerfully disagreeable. Higher intelligence means more likely to think that one is right about things, come to one's own conclusions, and less likely to go along with what others say just because they say it.


I asked what he meant by agreeable because what you are describing above is not the dictionary definition of agreeable as I understand it. It is generally used either as "agreeable to x" or just "agreeable," where it means pleasant. I think the latter is along the lines of "he is agreeable to be around" (where the agreeable one is the thing being agreed to).

I strongly disagree that higher intelligence equates to "more likely to think one is right about things." Haven't we had at least one post describing the opposite? Less smart people think they are correct. Smarter people understand that things are complicated and are better able to appreciate that they may not be right. Everyone comes to their own conclusions.


My understanding is that the Big Five Personality definition of the agreeableness trait pretty closely aligns with "goes along to get along."


I think you are mixing up the tendency to update one's priors and adjust to new facts (which would be correlated with intelligence) and what I was talking about with respect to "being right", which meant that one is not going to say that something is correct, when it obviously is not, just because of social pressure. I was talking about thinking one is right as a matter of sociability rather than being open to changing one's mind based on new information.

Example: you are sitting in a room with ten people and everyone is asked to come up with an answer to some question where there is an obviously correct versus incorrect answer on something factual (maybe a math problem). If the other nine people absolutely insist that the wrong answer is correct, and want you to go along with them, and you insist that your correct answer is right, you are probably not very agreeable.

Agreeableness is referenced here in relation to the "Big Five" personality traits, which is what is shown in the table above, and the table does indeed show that the cognitive genes are inversely related to agreeableness (i.e. more genes for cognition/IQ make you less likely to be agreeable), while the non-cog genes were positively correlated with agreeableness. The literature I've seen shows no strong association either way with IQ, probably because the trait of "agreeableness" has a bunch of sub-traits which likely work at cross-purposes. "Compliance with social norms" is one, but so are empathy, compassion, and sociability. Honestly, these are all pretty different things and perhaps should not be lumped together at all.

In any event, low agreeableness should NOT be equated with being "grumpy", which would go more towards the neuroticism factor of the Big Five personality traits (i.e. moodiness and emotional instability).


I was not aware of "agreeableness" or the "Big Five." That was why I had asked for clarification. On the "being right" issue, I suspect I just misunderstood you. I read being right as "being certain" and in my experience, one of the hallmarks of very bright people is the opposite.

There is a tendency to hedge: "it feels like", "is it possible that", "this reminds me of." This is perhaps increased awareness of one's innate (non-numerical) Bayesian reasoning. Interestingly, this can also come across as agreeableness (hedging), but at least some of it is a deeper awareness of the complexity of issues and an ability to acquire understanding without confidence in a solution.

Undoubtedly, high-IQ people feel less need to prove themselves and are well aware of the potential benefits of being agreeable, as well.


Yeah, sorry, I wasn't very specific. Though you caused me to go look more at what constitutes "agreeableness" in reference to the Big Five personality traits, and it turns out that there is quite a bit of disagreement on this and it has many sub-components that don't seem very related. Empathy versus being submissive/easily conceding to the group versus stubbornness versus friendliness -- it's not clear to me whether these traits really have anything to do with each other whatsoever. So perhaps it isn't surprising that IQ doesn't show any strong correlation to agreeableness one way or the other. And it's fairly context-dependent anyway. I'm pretty far towards being extremely laid-back and easy-going in 90% of situations, EXCEPT in instances where I'm expected to agree with something I think is factually wrong, in which case I'm horribly obstinate and entirely disagreeable. That's probably not that unusual, and while I think that would measure as low neuroticism and low agreeableness, I'm not sure.


This has, naturally, been studied, and IQ has been found to have a modest positive correlation with agreeableness.


I don't think this is true, most research I've seen shows no relationship of agreeableness with g. The chart from this study above also shows an inverse correlation with cognitive genes (but a positive correlation with non-cog genes).


Academia filters out low-intelligence people, and lots of people want to be in academia. I don't see a situation where truly low-intelligence people are the ones getting PhDs.

Comment deleted

What do you mean by low intelligence?

Comment deleted

I think your word choice is misleading here, since you're trying to distinguish the "merely bright" from "geniuses"; describing the former as "low intelligence" is true in the relative sense, but the people you're thinking of are still above average intelligence for the general population.


An extremely hard working person who is obsessed with their subject but has an IQ of 105 can probably earn a PhD in most fields.


I doubt that. In my experience, smart people overestimate how smart the average person is (because smart people generally live and work with smarter-than-average people). Also, I know a few people who failed the qualifying exams in my PhD program in a natural science. They were all, every one of them, hard workers who loved the subject. They were also all much, much smarter than 105 IQ. They just weren't smart enough for the program.


I wouldn't include hard sciences in my statement. There are a lot of easier fields.


see here: https://www.iqcomparisonsite.com/occupations.aspx

Among college professors, 10th percentile = 97, 25th percentile = 104, 50th percentile = 115.

I'd expect college professors to be a bit higher than PhDs in the same field because there's an additional filter in the job application process. The average IQ of PhDs has probably declined over time as the number minted per year has increased ~20x since the 50s. I've heard elsewhere that the average IQ of PhDs was 125. That might be a much earlier sample. Or it might be the case that the easiest fields (such as grievance studies) have higher rates of PhDs becoming professors to spread the mind-virus instead of going into industry to do something useful, since there's nothing useful to do with bullshit fields. That could make the average of professors lower than the average of PhDs even if each field has above-average professors relative to the PhDs in the same field. (assuming almost all professors have PhDs)

Anyway, that was a long digression, but if the 25th percentile of college professors have an IQ of 104, it's totally plausible that people with very high noncognitive traits but IQ 105 could get PhDs in most fields.


Point taken, and I’d agree with you if “most fields” can be taken to exclude STEM.


> Depression is just bad. I strongly recommend not having it. Don’t even have any risk genes, if you can avoid it. All of you people trying to come up with clever evolutionary benefits for depression, I still think you’re wrong, and so does the genetic correlation with cognitive and non-cognitive aspects of educational attainment.

I am not a doctor or biologist, but given that rates of depression are heavily correlated with ethnic background (at least that's the stereotype; I haven't validated this), I would expect that depression is genetic in nature. And if it is genetic, I would assume that there is some kind of compensatory benefit to keep it around.

I don't know what the benefit is, and it seems pretty all-around shitty to me. But why would evolution specifically give you depression genes? There has to be some evolutionary benefit.

Comment deleted

Those who don't want to be a burden on those around them could just leave, so they are no longer around them.

author

Is it actually that correlated with ethnicity? https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1199525/ (first study I Googled, I'm not claiming to have looked into this deeply) says it's 8% of whites, 9% of blacks, and 11% of Hispanics. That seems pretty similar - any differences could be accounted for by social deprivation or culture.


Aren't suicide attempts, or even just suicides, a better measure for this sort of comparison?

In my own searching (which admittedly was in a journalistic context rather than an academic one) I seem to remember finding that depression rates suffer from a large number of effects that skew the rates of diagnosis, chief among them the fact that merely being the sort of person who deals with those problems by going to get diagnosed, instead of simply stewing in them, makes someone a non-typical example. Suicide rates bypass this for the same reason murder rates are great for looking at crime: it's tough to ignore when a person is gone.

I don't know to what extent the study you linked can control for that, but looking at suicide data presents a very different picture, with there basically being a hard split between Whites and Amerindians on the one hand, and Blacks and Latinos on the other. The former group has suicide rates about double the latter. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8155821/

This doesn't necessarily show the effect to be genetic, but it certainly leaves it open as a possibility.


Cultural confounders seem to be a pretty big issue here. I'm thinking specifically of Asian acceptance of suicide compared to Western disavowal of it. Breaking that down further, it looks like religion is a significant component of it. Black and Latino populations are known to be more religious (of the type that is anti-suicide) than White and especially Asian.

I don't know that this is necessarily true, but that's been my understanding.


That's a reasonable point to make, though I struggle to think of what exactly the cultural confounders could be, given:

- Asians have the lowest rates of religiosity, but also very low suicide rates, while Blacks have the highest rates of religiosity, and yet also low suicide rates

- The two races with the largest percent of single-parent households are Black and Amerindian, and yet these two lie on opposite ends of suicide rates

- The two races with the highest rate of gun access are White and Black, yet they lie on opposite ends of the spectrum as well.

- The two wealthiest races per capita are Asian and White, yet they are opposite in suicide rates. The two poorest are Amerindian and Black, which are once again polar opposites when it comes to suicide.

There might be some sort of cultural component, though I'm unsure what research has been done on that, but it isn't clearly evident from any factor that I would expect to influence suicidality.


I doubt it's the explanation, but I've read that Black children are actually more likely to have both parents than White children. We assume the opposite because Black children are less likely to have married parents, but Blacks often stay together without marrying. (As it happens, I read this the same year that I did taxes, and two of my clients were an unmarried Black couple with a child.)


An interesting perspective that I'd never considered. If this is true, it would make for a fantastic example (for myself at least) of how basic statistics can sometimes mislead you about the reality on the ground.


I'm not sure what you mean by kids being "more likely to have both parents" -- all kids have two parents, unless one or both are dead. But if you meant LIVING with both parents, though unmarried, this is not true. The share of Black children who live with both parents, including both married and unmarried but cohabiting, is still less than 45%. For white kids that's 80%, and for Asians, almost 90%. There's a huge racial discrepancy: MOST black children do not live with two parents, while most children of other races do. The portion of unmarried parents who live with their kids is about the same for all races (6-9%), except Asians, where it's unusual.


Strength of social networks comes to mind. A more 'tight-knit' black family may provide pressure against suicide if you have a functional-ish family structure.


Some economist should find a natural experiment and study whether or not converting to X religion or being adopted by X culture actually has a protective effect against suicide (or increases income, or increases life expectancy, or increases life satisfaction, or whatever).


Could Japan being counted as part of Asia be confounding the Asian numbers? They have a history and tradition of ritual suicide for various reasons as part of their culture (less often in modern times but still a thing). My intuition is that people with fewer existential problems (hunger, thirst, shelter, power etc) tend to be less able to cope with the smaller problems that remain, which means I would expect whites to be "winning" in suicide rates while everyone else is lower but still around the same order of magnitude.


The rituals and traditions for suicide in Japan were related to behavior in war, though: the banzai charge/gyokusai attacks.


There are likely different confounders there - people are more likely to commit suicide when they have the easy means to do so, which probably varies a lot by culture, region, wealth, etc.

author

No because suicide isn't a good proxy for depression. It has lots of non-depression causes (eg alcoholism, schizophrenia, impulsive action after one really bad day) and lots of non-depression preventors (eg religious people are much less likely to commit suicide, maybe because of fear of afterlife)


While I understand that those make for issues with using suicide as a proxy in general, are they truly an issue for this particular case (racial comparison)?

Schizophrenia has decent evidence of being race-neutral, rates of alcoholism aren't different enough between races to account for anywhere near the gap (except perhaps among Amerindians? I am failing to find any decently reliable dataset that includes them in alcoholism data), and unless white people are comparatively incredibly impulsive, impulse control wouldn't be a differentiating factor.

And as far as preventors go, see my response to Mr. Doolittle above. Races with similar rates of religiosity fall on opposite sides of the suicide spectrum. The same goes for income, educational attainment, gun access, absentee parents, etc.

As far as I can tell, considering these factors only makes the racial separation even more stark.


Not all religions are the same, and definitely not interchangeable. Western religions (Jewish, Christian, and also Muslim) have a very negative view of suicide, which I don't think exists for Eastern religions. Japan is well known to have very different views of suicide than in much of the rest of the world.


Just ask Hamlet about suicide...

To die, to sleep;
To sleep, perchance to dream—ay, there's the rub:
For in that sleep of death what dreams may come,
When we have shuffled off this mortal coil,
Must give us pause.


I bet depression and suicide are correlated with more free time (being less busy), more of the basic human needs covered (food, shelter, sex, security), and fewer close relationships. And it's not only depression but health in general and life expectancy.


Didn't Émile Durkheim note that Protestant regions of Europe had higher suicide rates than Catholic areas, and that rates varied with how industrial or urban the area was?


Well, I'm only a layman, but here is a possible benefit:

Don't mildly depressed people have a slightly more realistic view of reality compared to the rest of us?

So let's say you are in a historical environment and you believe something maladaptive that you derive negative utility from, but that is not so bad that it kills you.

This over time depresses your mood, enables you to see reality better and gives you a chance to change your mind.

Depression acting as a belief/behaviour switching function.

Historically the lack of motivation wouldn't have been too much of an issue, as you are cold, hungry or horny enough to push past it. The lethargy may even be good in giving you slightly lower caloric upkeep while you figure out what you are doing wrong.

Of course, if depression gets too deep, that's a failure case. But hey, we know evolution only aims for "barely good enough".


Depression is the failure case. Being sad because things are going wrong is just normal behaviour.


So it's only labelled depression when the results are bad? In that case, by definition, depression is always bad, as Scott said. I always imagined it as a spectrum sort of thing rather than an on/off switch.

You're right, I should have used the term sadness rather than depression, if depression is the technical term for the failure case.

I kinda see them as the same thing, though, differing by degree?


That something must be bad to be considered a psychiatric disorder is, I believe, true of all psychiatric disorders. Pretty much always, one of the DSM requirements is that it interferes with your quality of life.

From nih.gov for "Major Depressive Episode":

- Symptoms must cause significant distress or impairment


I think this labeling issue is at the heart of the problem Scott's trying to get at. If you only label bad cases, then all cases are bad. If you label adaptive cases that don't reach the level of bad (we call it "sadness") in the same category, we might be able to say something meaningful beyond "depression is always bad."


As someone with lots of experience with depression, for me it's not about having a realistic view of reality; it's not caring about reality at all. When I'm having an episode, it's impossible to imagine anything being interesting. It's all just boring.


>Don't mildly depressed people have a slightly more realistic view of reality compared to the rest of us?

I think perhaps you mean pessimistic people, not people with clinical depression.

That being said, experts who are optimists tend to make significantly (20%) more accurate political predictions than pessimists (Expert Political Judgment: How Good Is It?, Philip E. Tetlock, 2005).


Somebody did a study with a box with lights and buttons, and asked people to estimate how much their button presses influenced the lights. Most people overestimated, but depressed people tended to get it right.

I think maybe I read about this study not replicating? But I've read about so many things not replicating I might have gotten mixed up.


Key question here though: by depressed people do you mean people at that moment suffering from depression or just people at some point diagnosed with it? Because my experience of suffering from depression is that your ability to analyse and observe is markedly different depending on whether you are suffering an episode or not.


Pain is just bad. I recommend not having it. Yet its absence is maladaptive.

Abraham Lincoln seems to have had depression. I suspect that high IQ + depression (to a degree that isn't disabling) might be more suited to navigating times of strife than high IQ + alternative, particularly for leaders/decision makers.

What's the alternative mental state during lengthy, difficult circumstances? Cold indifference? Cheerfulness? Better to err being in a depressed state than one of emotional deadness or irrational glee.


Sadness under circumstances where it makes sense to be sad is not depression. Depression is constant, hopeless sadness for no reason.


No there doesn't. Natural selection works on the entire genome at once; it doesn't individually optimize each and every gene. (If nothing else, the space is so high-dimensional it would need a huge amount of time to do that.) It's completely possible for natural selection to optimize foo (which confers a major survival benefit) which alas brings along bar (which confers a mild disability) for the ride, because the two genes happen to be hooked up at the DNA level in any of the various ways that can happen.


There seems to be an obvious benefit to seasonal depression at least. In winter:

* there's little food to eat

* it's cold outside

* meeting strangers will get you ill easily

* there's no work to be done on the fields

Therefore if you're naturally inclined to

* eat little

* stay inside and not meet anybody

* sleep all day

…you're more likely to survive winter, especially if you live in the Middle Ages or earlier.

Note that none of these conditions will get you a smart brain or a high level of functioning, but that's not the point of survival.


You do realise that agricultural societies work throughout winter? It was the only time that was available for non-routine work, although there were annual activities as well. Winter crops, preparation for early spring planting, finding fuel, making repairs, clearing drainage, pickling and brewing, digging out the sheep, taking in new land... Not doing these things (often communal activities) screws up your chances of prospering and therefore surviving/having your children survive.

Also, the positive genetic argument for depression has to deal with the minor issue that it generally removes your sex drive, which is an evolutionary fail.


Are human beings, biologically speaking, more a product of agricultural society or of pre-agricultural society? I would presume the latter, and so to the extent that depression is biological I could see how it could make sense evolutionarily.


Keep in mind, not all evolutionary benefits remain beneficial in a context as different as the modern world. An obvious example is food: we have access to more calories than we know what to do with, and adaptations to save calories wouldn't be advantageous today.

It could be that depression used to have an evolutionary benefit but somewhere between hunter-gatherer and flappy bird it lost that advantage and just became purely negative.


Title on the graph seems wrong, "EA FDR correction tries to impute the results from the main timeline, where Roosevelt was an effective altruist and diverted the resources of the Depression-era US into curing all diseases."


It's a joke about the key for the grey line.


But I also initially thought Scott had made a mistake! (Some of us read the text underneath to figure out what on earth we're looking at before reading all the acronyms in the very dense picture!)


It’s also a play on the bad leading in their figure legend — EA and lines below it (incl. FDR) are separate labels.


I feel like the evolutionary arguments for intellect are getting weaker with each post. I think it might have mattered for a given time for humans in terms of surviving, but... when things which seem evolutionarily vestigial at best are vaguely correlated with things we denote as success, it does seem a bit... conflated with a lot of concepts. Like, for example, we have the well-known IQ/fertility inverse correlation. We also seem to be unable to generalize g meaningfully across a variety of species [1]. Even if there have been studies regarding things like other primates and rats, to suggest g could be found in dolphins, elephants, birds, or the proud nematode seems... a difficult ask, unless one adopts a human-centric model of cognition. There is also a somewhat iffy question when it comes to talking about forms of human organization over the years and the role of intellect, and how important it was in various societies. The original psychometricians hypothesized humanity started being dysgenic in the developed world prior to their first studies, which... well, suggests a relatively low bound for humans today where IQ would give one an *evolutionary* benefit, as opposed to a societal one. ADHD and depression, the two you mentioned as being effectively evolutionarily irrelevant, have some of the highest frequencies within the human population. Especially with a lot of the research today, I feel like intelligence could possibly be described not as an evolutionary benefit, but as a human-society-beneficial "quirk".

[1]: How General Is Cognitive Ability in Non-Human Animals? A meta-analytical and multi-level reanalysis approach. Poirier, Marc-Antoine, et al. https://royalsocietypublishing.org/doi/10.1098/rspb.2020.1853

author

I think you can just check and intelligence is being positively selected for over evolutionary time scales. See eg https://www.nature.com/articles/s41598-018-30387-9 (random study, haven't done a deep dive but I believe it's representative).

This doesn't mean things haven't reversed in the past few centuries, but that probably hasn't had too much of an effect yet.


This gets kind of easily... much more complicated than I am really read up on. Yes, that study suggests that intelligence was selected for over around 500,000-750,000 years in humans. This... well, is a bit odd, partially because relative to some other evolutionary time frames, this is a small time frame. But, again, I'm not an expert in much of this. There are more than a couple of paradoxes regarding intellect which say equivalent things. There is also this paper in "Astrobiology": https://www.liebertpub.com/doi/10.1089/ast.2019.2149 So... I don't know. We have a hard enough time measuring g in animals which are further away from humans (does anyone want to measure the g of a whale or dolphin or octopus, all of whom display cephalization?), so it seems the research here is still overall unsupportive of g being important in anything *but* humans. If I am wrong, you should see other species select for cognitive ability, and evolution select against those species which do not. This does not appear to me to be an obvious conclusion in the least, given we still have lots of phytoplankton and jellyfish roaming around.


Evolution wouldn't necessarily select for or against cognitive ability. It would depend on the selection pressures of the environment. Perhaps humans were in a unique situation. For example, they could have been a species which could use their brain because they have hands to manipulate things with. A fish might not need the intelligence because it could be metabolically expensive and not afford much benefit.


I dunno. The more we're thinking about this, the more I realize that this is probably worthy of a fair amount of research into. Neanderthals went extinct 40,000 years ago, which is absolute fucking peanuts on an evolutionary time scale. You could easily make arguments that either the speciation period from Neanderthals to humans was a bit of a punctuated equilibrium, and humans are unique for selecting for intellect in that time period, or that the momentary analyses of "dysgenics" are the wrong headed analyses due to their recency and modern society. Human biomass did increase relative to everything else in the last few centuries as well, indicating a "selection for intelligence" (assuming humans are the most intelligent), but before, this was not the case at all, which is when intelligence *was* being selected for among humans in particular.


Maybe I'm misreading your argument, but neanderthals and other archaic humans almost certainly selected for intellect as well. Neanderthals had rudimentary art, decoration, funerary practices, almost certainly language, etc. which are all things they developed extremely quickly on an evolutionary time scale compared to their ancestors.

It seems likely to me that the conditions for massively accelerating intelligence are an environment and a species toolkit that makes each marginal improvement in intelligence create outsized gains in reproductive probability. The climate and environment of the archaic past was much more hostile than the one we have today, and with even very basic technical advancements (the use of specialized stones rather than general hand axes, for example) the odds of survival shoot up in that environment.

The human species is no longer facing those conditions, so selection for intelligence has slowed down. Our growing biomass is just the outworking of a process that has more or less been pre-determined since we discovered writing and gained the ability to greatly increase information transmission.


There are still implicit hypotheses here I would very strongly advocate testing! Are selection effects for intellect weaker in other tool-using species? Like, I think monkeys and elephants might not benefit enormously from such things or invent agriculture, but you would expect monkeys who know how to use rocks to get into a coconut to have higher reproductive success than those who die not knowing how to get at the meat of a coconut using a rock. Also, similarly, before/after IQ tests of individuals during a pandemic. If one were... again, a *terrible* idea ethically and I would not do this under any circumstances... particularly intrigued by this, you would expect the cohorts most affected by covid and other selective pressures (i.e. old people) to show higher average genetic IQ as they get older.


You need to think in terms of ecological niches. Organisms (plants and animals) evolve to fill each available ecological niche in a given environment/ecosystem. For some of those niches, like say pack hunting ruminants for protein and fat, intelligence will be an asset. For others, like turning sunlight into plant food, it won't be. This is what determines if it gets selected for.


I mean, again, this... is weird. I would say, for example, intellect would be a factor in whether one is the leader of a given pack of hunters for hunting something, but it is probably much more of a factor in whether one programs computers. Yet, intelligence has been selected slightly against in contemporary society, whereas the former society *did* select for intellect. This gets even weirder when you consider humans mold their ecological biomes literally all the time--so it could have something of a dynamical systems flavor to it, where an equilibria of "intelligent individuals" is reached. Gwern also discussed something along the lines of a "biological maximum intelligence for humans" when discussing studies of genius. But, again, this is a kind of trees view of my point. Intellect *was* an evolutionary advantage for *human individuals* undergoing competition, which, ironically, would limit the scope of "intellect" to humans. This would, understandably, limit the scope of saying it is an "evolutionary advantage", as I think most plants are existing just fine.


I agree, it does seem odd that there would be more selection for intelligence among hunter gatherers than in the past few hundred years, when the economic and social returns available to people of above average intelligence seem so obvious. Human social dynamics are complex, though, so it could be any number of factors.


I feel like selection in general is weaker than it used to be, because almost everyone survives to adulthood, and most people have kids. Maybe the sweet spot for low selection was in the decades after 1945; few young deaths, and also few childless weirdos like me.


And see, this is just *one part* of the weirdness (apologies for all the asterisks). I once heard someone say it would be better modeled as a sort of resource expansion allowing high-reproduction/low-survival individuals to have higher fitness than their low-reproduction/high-survival peers, but even then that (given human ways of interacting with the environment) leads to a possible evolutionary advantage for the species as a whole, while kind of flipping the intraspecific competitive advantage on its head. I feel like the social dynamics of such scenarios have been done to death, but this is beside the crucial question, which is: what exactly is the "natural resting state" for whatever smart alleles we've identified or haven't identified?


Come. Are you seriously debating whether horses are smarter than houseflies? The fact that it's hard to judge close calls does not in the least call into question the capability to judge *at all*.


Well, again, the big epistemological worm I have running around in my head is that we (as humans) defined intellect as being able to do human things. Why are primates smarter than, say, an elephant, or a giraffe, or a hummingbird? Because it can do complex material constructions which are similar to the ones humans make. Now, this gets even weirder, when you consider that the tentative neurological definitions (or rather, correlates) of intellect are... well, kind of unintuitive. We know something regarding the structure of certain neurons has something to do with VIQ in humans and cortical thickness of a certain brain region has something to do with PIQ in humans. It also (tends to) correlate with things like reaction speed. Given that hummingbirds are really fast, squishy small things that do material constructs, and elephants are slow, big things that also do similar constructs, which is smarter than the other? This also gets into weird stuff--macrocephalic people are not very smart, but head sizes of normal people are somewhat correlated with IQ. Houseflies appear to react *faster* than horses, at least.


We can easily observe intelligence in animals, because we define it as "solving problems." They don't have to be problems that are interesting to humans, and indeed they are typically not, they are problems interesting to the animals, like finding food or how to get out of or into some space with barriers, et cetera.

Anyone who has owned animals knows that some are clearly smarter than others. Some dogs are smart, some are dumb, and any dog-owner can tell that. Some dogs figure out how to beg or steal food cleverly, some are dumb about it, some never figure it out. You can fool some dogs easily, some are much much harder to trick.

Some horses are smarter than others, e.g. can figure out how a latch works, or when a gate is likely to be left open, and someone who works with horses can see that easily enough. Heck, I've seen someone who kept rats as a pet observe that some rats were clearly smarter than others because they could solve problems interesting to rats -- finding food, shelter, mates, whatever.

It feels to me like you're getting lost in mechanistic details, and overlooking the basic crude observational data, which is that we can clearly see that individual members of a species can solve problems important *to that species* at differentiated rates of success, over time. We can also observe that one species can solve problems interesting to that species, on average, better than another, so we can readily conclude one species is smarter than another.

I'm sure it's much harder to distinguish fine gradations of intelligence in dogs than in humans, because we know much less about what's important to a dog, and how dogs think. But these are, if you will, merely implementation difficulties, a challenge to experimental or apparatus design; they don't call into question the entire existence of the trait.


This is kind of a Moorean response to the question, which, while valid, doesn't really provide any operational definition. If a spider spins a more convoluted web, is it smarter or dumber than a spider which spins a simpler web? What if the former catches more food than they need while the other catches just what they need and both survive and have spider babies? And, I want to point out, you chose examples of domesticated mammals. What percentile is Koko among all the other famous signing Gorillas?


Yeah I think you are still favoring the types of problems that people think are impressive. Insects are great at surviving and solving problems in ways that humans are totally uninterested in or actively hate.

Virtually every species that humans consider to be particularly intelligent is one much like us, i.e. a social-group living species that hunts or is at least omnivorous. Dogs, dolphins, whales, elephants, primates, wolves, etc. So either living in a socially cooperative manner confers greater intelligence (which would make sense, because having to consider not just your own interests but also how the group will react to your behavior is a lot of cognitive load), or we just tend to be particularly impressed with forms of intelligence that we can recognize and that serve our interests.

I take your point in that I've had pets that were clearly smarter or dumber than others. But it's also hard to say in some cases. People tend to consider dogs smart when the dog is easily trainable and likely to do what the human wants. I'd say cats are smarter than dogs but people tend to think dogs are smarter because they're submissive to humans and want to please them. One of my dogs learns new tricks at literally about 200 times the speed of the other -- it's a stark difference -- and is much more attuned to people and how to manipulate them. But the stubborn/"dumb" one has far superior survival skills with respect to being appropriately suspicious of danger, knowing when to hide or approach, hunting skills, etc. If they weren't living with and dependent on humans, I would expect the "dumb" one to have far greater survival chances and the smart one to be dead within a month.


Well, there are two possibilities here:

(1) Intelligence confers a survival advantage to humans, and natural selection has tended toward optimizing it.

(2) Intelligence does not confer a survival advantage to humans, and the fact that humans -- every one of them -- are about umpty times smarter than dogs or rabbits is just a weird coincidence.

The problem with (2) is that it asks us to believe that *the* most salient characteristic of a species is mere accident. It's like observing that cheetahs are really, really fast in a sprint, but this is just an accident, and their survival as a species actually derives entirely from the clever design of their spots.


See my "dynamical systems" post above. And, again, I think this is a somewhat wrong-headed view of evolution. The mutations themselves (this particular mutation selecting for itself, especially) are a bit of a departure from normal evolutionary form. Consider that a lot of evolutionary processes took *millions* of years to occur. Considering most primates still use ancient tools and get along (as species) just fine, and most mammals have hooves which make the use of tools somewhat irrelevant, the fact that formerly non-sapient monkeys went to making art and writing in the span of less than a million years is certainly... weird. But, again, I don't know. Contemporary humans today are probably more "evolutionarily fit" than humans centuries ago simply because we occupy more biomass than they probably did as a ratio of the earth's biomass, but we also don't select for intellect anymore? Note that if the human population declines and other species' biomasses don't decline, then humans would be *less* evolutionarily fit than now.


In terms of very recent evolution, there's reason to think Western Europe was eugenic between the 14th and 20th centuries because so many violent criminals were being executed and the upper classes were having relatively more kids than the lower classes. Not an exact measure of changes in intelligence, but it's something.

https://journals.sagepub.com/doi/full/10.1177/147470491501300114


As for the hypothesis here, you would then expect any circumstances which had an undue burden on lower class people to like, change the IQ of a given population before and after. You know, like a pandemic. Not sure if this would be looked at, partially because anyone who did would be killed no matter the conclusion. I'm not even sure if that would be the incorrect response to such an idea, but it's there.


This essay is very interesting, and it led me down a fascinating rabbit hole.

However, it honestly seems like his claims about the genetic effects of capital punishment are wildly overstated, by a factor of ten.

I was very shocked by his claim that men in Late Medieval/Early Modern Europe had a 0.5-1% lifetime chance of being executed, due to an annual execution rate of 1 execution per 10,000 people. When I looked up the source he cites for this claim (L'Occident est Nu, by Philip Taccoen) on Google Books, I discovered that he really does appear to have simply misquoted Taccoen: the latter says that Early Modern England and Malines, as well as 18th-c. France, executed one out of every *hundred* thousand people annually.

(The other source this paper cites for the execution rates claim, Paul Savey-Casart, isn't an independent source but simply the one cited by Taccoen as the source of this claim. Based on the lack of a page number given for the Savey-Casart citation, I strongly doubt that the authors of the paper tracked down Savey-Casart's book themselves.)

It's entirely possible that I'm totally wrong and there's some obvious issue I'm missing--but it really does seem like they founded a significant part of their thesis on a claim that could have been disproven by simply double-checking what their source actually said.


I guess I'm confused why generalizability to other species has any relevance to g as it applies to humans. It seems fairly obvious that intelligence has been strongly selected for (how else could we have gotten so smart?).

A point that it seems you're trying to make is that humans aren't any more intelligent than other species; we all fill our niches equally well. A bee could look at a human with a PhD and think them unintelligent because they don't know how to build a beehive. The part that I think this view misses is that bees (and all other non-human animals) only know how to do one thing. Each species of bee is evolved to build one type of hive, produce one type of honey, and collect nectar from one ecosystem's flowers. Humans, on the other hand, have been able to spread over the globe over the last ~40,000 years into every possible ecosystem and have adapted quickly, with our intelligence rather than with slow natural selection, and it is that capability that we use to justify our intelligence relative to other species.


https://evolution.berkeley.edu/evolibrary/article/side_0_0/punctuated_01

So, OK. Humans got smart in a pretty short time, evolutionarily speaking. If *intellect* is an *evolutionary advantage*, full stop, we would see evidence of this in at least some other species than humans. Also, rat IQ tests would be practical for measuring genetic drift of... rat IQ alleles over generations.


I don't think anyone makes the claim that intellect is always a positive trait; as people have mentioned elsewhere in this thread, it involves a heavy trade-off between general-purpose intelligence and energy usage, so in most niches it is not something that is especially selected for.

Rather, I think that intelligence is something that became useful in humans' particular niche (perhaps due to our ability to learn socially) and then rapidly evolved to fill that niche.


I mean, again, I am sympathetic to this line of thinking, but it doesn't answer "why was there evolutionary pressure for us to be smart in the first place?" More seriously, what is the evolutionary purpose of an intelligence which brought us to a given standard of living, only to then kill itself off in terms of frequency within its own species?


I'm not sure what you mean by the second part; there is no purpose of intelligence. It's that more intelligent people outsurvived and outreproduced less intelligent people, causing the average intelligence of the population to rise. What is left to explain?


A lot, actually. The centuries-long phenotypic reversal of this trend. Directions of research going forward. You know, clearly defining that thing we enjoy and like so much, on at least some sort of interspecies level, might aid us in constructing or understanding intelligences we may or may not build. The endgame is probably a mechanical model of what we mean by intellect.


"As of last time I checked, the leading hypothesis was that schizophrenia genes were just really bad, evolutionary detritus that we hadn’t quite managed to weed out."

Is it really? I thought the idea that schizophrenia genes/low levels of schizotypy are somewhat positively associated with creativity (when they don't result in full-blown schizophrenia, which does decrease creativity) was rather widespread. I wouldn't necessarily have expected creativity to correlate with educational attainment, but it's clearly a positive trait that is easy to imagine being selected for in many environments - not quite what you'd expect from "evolutionary detritus". E.g. https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1601-5223.1970.tb02343.x or https://www.nature.com/articles/nn.4040 but a Google Scholar search would return much more.


As I posted in my comment, Robert Sapolsky discusses this idea at length here: https://youtu.be/4WwAQqWUkpI

Awesome for the overview and you can listen while doing something else, too.


I have heard the same thing. I don't have any cool links to share on the subject, though. :(


I'm not familiar with this type of research at all. When one says "found genes for intelligence" or "found genes for educational attainment", what does that mean? Is the claim that some portion or portions of the human genome have been identified that, when they look like 'x', 'y', or 'z', make a person unintelligent, of average intelligence, or very intelligent, respectively?


Suppose you have 100,000 genetic tests and you know the educational attainment of each individual. Any sufficiently large randomly selected sample should have approximately the same average educational attainment. If instead of selecting randomly you select by presence of a specific gene variant, and educational attainment differs between the two groups, you have a correlation. Repeat for the next gene... (Where a gene is a DNA sequence that is, in its entirety, passed on by inheritance.)
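
A minimal sketch of that procedure in Python, with simulated data (the sample size, variant frequency, and phenotype numbers here are made up for illustration, not taken from any real study):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_people = 100_000

# Simulated phenotype (years of education) and a simulated genetic variant.
years_of_education = rng.normal(14, 2, size=n_people)
carries_variant = rng.random(n_people) < 0.3

# Compare average educational attainment between carriers and non-carriers.
carriers = years_of_education[carries_variant]
non_carriers = years_of_education[~carries_variant]
t_stat, p_value = stats.ttest_ind(carriers, non_carriers)

print(f"carriers: {carriers.mean():.2f} years, "
      f"non-carriers: {non_carriers.mean():.2f} years, p = {p_value:.3g}")

# A real GWAS repeats this comparison (usually as a regression with covariates
# like age, sex, and ancestry principal components) for every variant measured.
```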


Google tells me that humans have between 20-25k genes. That seems like a lot of free parameters, especially when you start to consider combinations of different genes. I'm not trying to dismiss a field I know nothing about out of hand, but I worry about chance correlations with a parameter space that large. What's known about the function of these genes? Is there any reason to suspect they would influence cognition? Do different studies using different test populations tend to find correlations between intelligence (or whatever trait you are interested in) and roughly the same set of genes?


Only 11 or 22 intelligence-related genes seems *really* low out of a pool of 20-25k.


A typical study might be looking at 2.5 million SNPs. So if they didn't know statistics and just checked for p<.05, we'd expect them to get on the order of 100k false positives.
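
Putting rough numbers on that (the 5e-8 cutoff below is the conventional genome-wide significance threshold, not a figure from this particular paper):

```python
n_snps = 2_500_000
alpha = 0.05

# If every SNP were truly null, naive testing at p < .05 would still flag:
print(n_snps * alpha)              # 125000.0 -- on the order of 100k false positives

# Hence GWAS use a much stricter, Bonferroni-style genome-wide threshold:
genome_wide_alpha = 5e-8           # roughly 0.05 / 1,000,000 independent tests
print(n_snps * genome_wide_alpha)  # 0.125 expected false positives
```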


"but I worry about chance correlations with a parameter space that large."

It was a huge problem when using small sample sizes (a few hundred) and a small set of candidate genes, leading to many, many false positive "discoveries" during the nineties and the aughts. But now the studies use sample sizes of several hundred thousand people, leading to much higher reliability, and yes, the SNP (a position within the genome)/trait associations are replicable, e.g.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6003860/.

("First, GWAS findings are highly replicable. This is an unprecedented phenomenon in complex trait genetics, and indeed in many areas of science, which in past decades had been plagued by false positives. ")


One single variant wouldn't make a person unintelligent, average or very intelligent but with enough data we can start to say that one single gene variant is either positively or negatively associated with intelligence.


Nice summary of the current state of research in the link below. I really appreciate the fact that they listed the major controversies about genes and intelligence without commenting on them — just giving links to footnotes. The fact that they admit there's controversy is a big step forward if you ask me. Lol! They also have a section on Parieto-Frontal Integration Theory (P-FIT), which seems to be 21st-century phrenology with brain scans. I guess I shouldn't be snarky about P-FIT, but when one studies the history of science one sees recurrent meta-theories arise over and over again.

https://www.nature.com/articles/s41380-021-01027-y#:~:text=A%20positive%20genetic%20correlation%20indicates,likelihood%20of%20developing%20the%20disorder.

This next paper claims to have identified 187 loci with roles in neurogenesis and myelination that seem to affect intelligence. Pretty dense (or maybe my myelination quotient isn't high enough to immediately grasp it).

https://www.nature.com/articles/s41380-017-0001-5

This third paper finds 22 genes that seem to affect IQ. I can't figure out how to line up their findings with the loci described above.

https://www.nature.com/articles/ng.3869.epdf

There was another paper I read a few months back that insisted that there were only 11 genes that are strongly correlated with IQ. But now I can't find it. I would have been interested to see if any of the 11 overlap with the 22 paper.


"Both cognitive and non-cognitive skills are correlated with neuroticism, which I guess makes sense - that’s probably what makes you do your homework on time."

They're both negatively correlated, though, right? So this also means that NonCog has correlation in the "good" direction for each of the Big Five, which makes sense.

Also, I'm trying to think of how to recreate this analysis, but without looking at genes. I guess it would mean that for a given educational attainment (say, college grads), people in the bottom 10% of IQ are more likely to be schizophrenic than the average college grad? Or at least have more relatives with schizophrenia?
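
One hypothetical way to set that up, sketched with simulated data (the rates, sample size, and column names below are invented; with independent simulated traits the two numbers will of course come out about equal):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 50_000

df = pd.DataFrame({
    "iq": rng.normal(100, 15, size=n),
    "college_grad": rng.random(n) < 0.35,
    "schizophrenia": rng.random(n) < 0.01,   # made-up base rate
})

# Condition on educational attainment, then compare schizophrenia rates
# between the bottom IQ decile of graduates and graduates overall.
grads = df[df["college_grad"]]
bottom_decile = grads[grads["iq"] <= grads["iq"].quantile(0.10)]

print("all college grads:      ", grads["schizophrenia"].mean())
print("bottom-IQ-decile grads: ", bottom_decile["schizophrenia"].mean())

# With real data, the idea is that low-IQ graduates needed more of the
# "non-cognitive" package to reach the same attainment, so the second rate
# should come out higher if the genetic correlations carry over.
```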

author

You're right, thank you, fixed.


The negative correlation with neuroticism makes sense to me in that on the Big Five, neuroticism means more like "emotional reactivity" than like "obsessive."


I'd like to see some exploration of the ADHD–IQ link. To me it seems plausible that ADHD genes might affect intelligence test scores without actually affecting underlying intelligence. This could be investigated by looking at the results of the subtests: if ADHD genes have as large an effect on subtests that require sustained attention (such as digit span) as on subtests that do not (such as vocabulary), that would be evidence against my hypothesis.
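
A sketch of that comparison with a simulated polygenic score and made-up subtest data (the effect sizes, 0.20 and 0.02, are arbitrary numbers chosen only to illustrate the contrast being proposed):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000

adhd_score = rng.normal(0, 1, size=n)   # stand-in for an ADHD polygenic score
g = rng.normal(0, 1, size=n)            # underlying ability

# Hypothesis: ADHD genes drag down attention-heavy subtests but not others.
digit_span = g - 0.20 * adhd_score + rng.normal(0, 1, size=n)   # attention-demanding
vocabulary = g - 0.02 * adhd_score + rng.normal(0, 1, size=n)   # attention-light

for name, subtest in [("digit span", digit_span), ("vocabulary", vocabulary)]:
    slope = np.polyfit(adhd_score, subtest, 1)[0]   # simple regression slope
    print(f"{name}: estimated effect of ADHD score = {slope:+.3f}")

# Roughly equal slopes in real data would count against the
# "hurts test-taking, not underlying intelligence" explanation.
```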


I think that sort of perspective on intelligence highlights a fundamental problem with this research: the criteria for "smart" are subjective.

For example, there's a very famous story about a world-class ballerina (use Google for the specifics) who was terrible in school as a child, so her mother took her to see a psychologist. He watched her and asked the mother what she enjoyed most and what she did in her free time. The mother said she liked to dance, so the psychologist simply said that that's what she finds most interesting: if she can't pay attention to arithmetic, she'll probably pay attention to dance routines. And she went on to fall in love with dance and become a world-class dancer.

In today's world, she probably would have been given ADHD medication and sent on her way. She might have thought she was dumb all her life, but it's impossible to deny that she was a genius in a kinesthetic sense. Her underlying "ADHD" was not a cognitive benefit in the classroom, but it was in the ballroom. This genetic analysis does not capture that sort of ability or its correlation with IQ.

I also seem to recall that Scott has a pet theory that a lot of ED docs have ADHD and thinks that might benefit them in their environment. But I wonder how their genes would fall on this type of table? Are they simply outliers who take medication or somehow overcame their ADHD to achieve a high level of academic and professional success anyway?


Why should dancing be considered a form of intelligence, though? I'm not sure that is what people think of when they think about being smart or dumb.

I'm also not sure how much we can say that the ADHD was actually a benefit in the ballroom. Perhaps she could have been just as good a dancer without ADHD. By what means does ADHD make her move around more elegantly?


My question is why shouldn't kinesthetic ability or other things like artistic genius or language acquisition ability be included in "intelligence"? It's somewhat of a subjective umbrella to begin with. That's my point.

The point is made below that ADHD might manifest as a latent desire for some specific things like music, dance, math, etc. and so that may imbue a certain drive for talent in that domain but also make for sizable distraction from others.


I'd say the reason digit span and vocabulary are both included in IQ tests is that there is evidence that they are affected by a common factor, which we tend to refer to as intelligence. I am not aware of any evidence that ability to dance has any strong link to that same common factor, and, might I add, having seen a fair number of intelligent people dance, I have not noticed such a correlation.


"having seen a fair number of intelligent people dance, I have not noticed such a correlation." nicely said!


I think this read is loosely related; it's about the seven arts, a.k.a. the skills for studying and educating oneself: https://www.delanceyplace.com/view-archives.php?4504


In my experience ADHD mostly hurts your ability to concentrate on subjects or tasks you don’t find intrinsically interesting or when there is a competing topic to think about that is more interesting than your current task. This is why many people with ADHD end up with grades that differ wildly between academic subjects. If someone with ADHD finds every or almost every relevant subject in undergrad and medical school fascinating and doesn’t have a lot of competing interests, they can do very well.

Expand full comment

Meant to reply to Hippo’s comment. Sorry about that.

Expand full comment

Yeah that’s a good point!

Expand full comment

Anecdotes are not data, but.....

I can attest to this. I was recently diagnosed with ADHD, but haven't taken medication so far. However, I was one of the more obvious cases since childhood.

I fully recognize myself as a completely different caliber of human when I am in focus/attentive vs when I am not. I also frequently take around 30 minutes to enter a focus state (the sensory deprivation of an exam hall helps, but the unfamiliarity of the environment takes a few minutes to adjust to).

It is incredibly irritating, because I would often be the person that helps friends prepare for examinations and helps them understand hard concepts. But when it came to the exam hall, I felt like my eyes & brain were dilated for the first 30 minutes. Then I'd slowly come back to earth and rush through things as I started 'getting it', but I never ended up reaching the end.

I feel like a cat sometimes. Giving off the appearance of a lazy organism with the intellect of a rock, until you actually need to be productive.

> hurts your ability to concentrate on subjects or tasks you don’t find intrinsically interesting or when there is a competing topic to think about that is more interesting than your current task.

Yep. I might as well use this as my elevator pitch,

> ADHD end up with grades that differ wildly between academic subjects

100%. I used to get Ds in classes that needed rote learning, including biology, but physics, math and English comprehension/essays came to me really easily.

> If someone with ADHD finds every or almost every relevant subject fascinating

Happened to me with Machine Learning. Went from a lazy bum to someone who was reading papers and textbooks for fun. Ofc, this was until hackernews and reddit pulled me back in. :|

Expand full comment

"In my experience ADHD mostly hurts your ability to concentrate on subjects or tasks you don’t find intrinsically interesting or when there is a competing topic to think about that is more interesting than your current task. "

Very interesting, thank you!

Expand full comment

The fact is there's very very few jobs you'll get by being a good dancer, and lots of jobs you'll get by being good at mathematical/logical reasoning. So it can perfectly well make sense to force kids to do the regular school routine -- if they can at least convincingly do it they might have a much better shot than if they try to become a dancer/footballer/whatever else.

Expand full comment

Right but economic conditions dictate what schools teach, so skill set demand changes. Dancing might not be prized today, but maybe 10,000 years ago it was for shamanic reasons or mating etc. (you get the idea). So on a genetic/evolutionary front people may have skillsets that are expressions of “genius” but in the current moment they don’t have the same economic value that we equate with “intelligence”. It’s just semantics and context.

Expand full comment

Even in today's society, I would predict a correlation between being good at dancing and reproductive success...

Expand full comment

When I was tested for ADHD they were specifically looking for this sort of subgroup divergence, so I believe that's a known thing.

Expand full comment

The problem is that once you start separating "intelligence" from "scores well on an IQ test", the whole analysis rather falls apart, since we don't have any other measure. Really this should be called "genes for IQ test score vs genes for educational attainment."

Expand full comment

I would define intelligence as g. Intelligence tests (and test items) differ in their g loadings, so I think it should be possible to associate genes with g instead of with test scores.

Expand full comment

IQ is the best measure but we have other measures. You can use any cognitively demanding task but some are more accurate and quicker than others. In any set of cognitively demanding tasks, you can find the g-factor. It is why people can use the SAT, ACT, ASVAB, or whatever and the scores will highly correlate and reflect intelligence. Charles Spearman called this the indifference of the indicator. [1]

[1] https://en.wikipedia.org/wiki/G_factor_(psychometrics)#"Indifference_of_the_indicator"
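
To make the "indifference of the indicator" point concrete, here is a minimal sketch in Python (invented data, not from any real test battery): several test scores driven by one latent factor, with a g-like score recovered as the first principal component. All loadings and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
g = rng.normal(size=n)                      # latent general factor
loadings = np.array([0.8, 0.7, 0.6, 0.5])   # how strongly each test taps g
noise = rng.normal(size=(n, 4)) * np.sqrt(1 - loadings**2)
tests = g[:, None] * loadings + noise       # four correlated "test" scores

# First principal component of the standardized scores approximates g
z = (tests - tests.mean(axis=0)) / tests.std(axis=0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
g_hat = z @ vt[0]

print(abs(np.corrcoef(g_hat, g)[0, 1]))     # typically high, ~0.85-0.9 here
```

This is the same logic behind why SAT, ACT, and ASVAB scores correlate so highly with each other: any reasonably g-loaded composite recovers much of the same common factor.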

Expand full comment

Unless you have some empirical measurement of "intelligence" that isn't functionally equivalent to "scores well in a test of intelligence" then any effort to do the separation is going to be mere sterile philosophizing, at best.

Expand full comment

"ADHD and intelligence" is a horrible clusterfuck, because even by psychiatric standards the "thing ADHD describes" is a number of disparate things, and some of those things are even more explicitly social-determination than for most psychiatric terminology. ADHD can refer to hyperactivity, to idiopathic executive dysfunction, to very non-idiopathic executive dysfunction, to "boy", to "not middle-class". It's the wastebasket diagnosis to end wastebasket diagnosis.

Artificially low IQ results are a fairly common issue in the sufficiently neurodivergent population (their applicability to autism is notorious, see the writings of Donna Williams and Scott Aaronson on their own results; I suspect some of the negative correlation between IQ and schizospec neurotypes is due to a tendency to think in much curlier lines than those tests permit), but they're much more difficult to pattern over something as broad as ADHD. Some people with ADHD definitely have artificially low results as a consequence of their neurotypes, but these may be better seen through a different lens.

Expand full comment

I have a kid with ADHD-Inattentive and bipolar. His ultimate level of educational attainment probably will not be high, although he is pretty bright (good writer, bad at math, horrible at history). His docs have told me that many ADHD kids are delayed in terms of maturity, which sometimes affects decision making (not him, thank god) and sometimes affects initiative, self-discipline, and attention to deadlines (definitely him). So, he may in time manage college, but it won't be soon.

By the way, there is nothing good about having bipolar disorder. What's good is the medication that keeps my son out of the hospital or worse.

Expand full comment

Robert Sapolsky mentions in this lecture that a *little* bit of controllable oddness on the schizophrenia spectrum is useful for making usefully charismatic shamans, and that could explain its genetic advantage and persistence. The more extreme, uncontrollable end of the spectrum, on the other hand, isn't an advantage.

https://youtu.be/4WwAQqWUkpI

(Also, this is the one lecture from this series that Sapolsky has removed from his own YouTube channel. He says some highly cancelable shit in this one, which is a shame because it sure does seem to make sense to me.)

Expand full comment

+1 for Sapolsky

Expand full comment

I have often wondered if the various "mental illness" diagnoses that get made are only picking up on the people that are so far out of the norm that they register as a problem. Crotchety old people, eccentric uncles, the "socially awkward" and so on probably have similar traits, but at more controllable levels. Millions of other people could have the benefits of the underlying cause, without stretching too far into a problem. We don't have the sophistication needed to identify someone who is *mildly* schizophrenic in a very positive way. We would probably label them as something else or not label them at all.

Expand full comment

“Weird”!

Expand full comment

I am pretty sure that some of the things that are mostly harmless nowadays would have ended with me dead if I had lived, say, 300 years ago, and if "mental illness" diagnostic criteria had been applied back then, I would have qualified.

(though both my mother and I would have died of birth-related issues if I had been born 300 years ago)

Expand full comment

"Again: you find that having more mutational load, more deleterious mutations, increased your chance of schizophrenia, or autism, or low IQ: that strongly suggests schizophrenia, autism , and low IQ are not the far edge of some strategy. Note: people talking about shamans and schiz: you’re probably wrong. Same for autism – not a strategy."

https://westhunt.wordpress.com/2018/07/22/more-theory/

Sapolsky also wrote a book saying zebras don't get ulcers because they lack chronic stress, well after Barry Marshall showed that ulcers were caused by a bacterial infection.

Expand full comment

> Depression is just bad. I strongly recommend not having it. Don’t even have any risk genes, if you can avoid it.

Couldn’t depression genes be like sickle cell anemia? Eg one depression gene, say, makes you more introspective, two makes you so introspective you fixate on everything bad?

Expand full comment

I’m struggling to understand the chart in ways that translate to common English. Is it fair to say that this chart suggests:

1) Noncognitive genes are more correlated with Academic Achievement than cognitive genes in the presence of high Big 5 Personality stats?

2) Schizophrenia and Bipolar disorder correlates to more academic success in people with more NonCog genes? So Cog genes + Schizophrenia = no correlation to academic success?

3) Noncognitive genes and academic success is correlated with significantly higher longevity as compared to Cognitive genes and academic success?

Expand full comment

My understanding of the chart is that the position of each dot on the graph says how strongly the bucket of genes behind cog or noncog achievement correlates with each trait. To tick off each point:

1) For the personality traits it depends on the specific trait, e.g. for openness noncog correlates stronger than cog but they're about the same strength for agreeableness (though directionally different)

2) The way I would put it is that the noncognitive genes linked to academic success are positively correlated with schizophrenia and cognitive genes linked to academic success are negatively correlated with schizophrenia. The statement you made isn't really something the graph addresses.

3) Best way to put it is that noncog genes linked to academic success are statistically significantly more correlated with longevity than cog genes linked to academic success.

Expand full comment

Hmm okay I think I don’t have a good intuition about how to conceive of the 3 variable correlation going on here (ie Schizophrenia, NonCog, and Academic Success all correlated). So in regards to 1) could you say that based on this data a person with more NonCog genes and a high degree of agreeableness is more likely to have higher academic achievements than someone else with Cog genes but low trait agreeableness?

Expand full comment

I think I'm explaining this badly. The two objects of study are "noncog genes linked to academic achievement" and "cog genes linked to academic achievement." These are groups of genes analyzed in earlier studies in the literature and found to correlate to academic achievement.

Another thing about this type of research is that it doesn't really work on the individual level. You can say that in a population (which is sufficiently large, has a normal distribution of genes, and other simplifying assumptions) the people who are most academically successful are likely to have these cog and noncog genes and that people in the population who have these particular noncog genes are more likely to have schizophrenia.

If you know which genes a particular person has and want to figure out their expected academic achievement, we basically can't do that because we don't know what effect these genes have on individuals or how that mechanism works; we only know how these genes express themselves in populations.

Expand full comment

Understood. I think I’m trying to force an explanation that isn’t there to make it more intuitive for me.

Expand full comment

Depression is being selected FOR in Sweden. While women with clinical depression average fewer children, their almost depressed sisters have increased fertility, which outweighs the cost to the clinical cases. Thus, evolutionarily speaking, the optimal point for female depression (not male) is higher than the current rate. That is, selection is increasing the sex difference in depression.

Of course, this says nothing of why this might be the case. Considering that women in every place and country are more depressive than men, this pattern must have originated a long time ago. Would be interesting to study mental illness, insofar as it exists (!?), in primates. Are female primates more moody and depressive than males?

https://twitter.com/KirkegaardEmil/status/1071512978695053312

Expand full comment

Are there more women at the extreme of happiness? If that were true, women as a whole may have more variation in mood. This would be unusual because men have higher variability typically.

Expand full comment

Probably. Even among dogs, it is commonly known that female dogs tend to be more moody and sensitive than male dogs. Though the difference isn't large, male dogs are more prone to being generally happy-go-lucky while females are more likely to sulk, act jealous, react badly to punishment, etc. These are attributed to increased situational awareness and sensitivity to the environment and social cues, which would seem to be obviously important traits for the sex that is primarily responsible for making sure the young don't die. I would expect you would see a stronger sex divergency depending on (1) how long the period of maternal care is after birth before juvenile independence, and (2) level of paternal care-giving (which in most mammals is zero). Also, women become much more prone to depression in the years after giving birth. It seems intuitively obvious to me that increased worry, anxiety, and seeing bad stuff everywhere would all be advantageous for protecting offspring and making sure they don't wander off a cliff, eat poison berries, or get eaten.

Expand full comment

I second this. I can see a number of reasons why genes for depression and ADHD might be selected for specifically in women.

A slightly grim outlook might indeed keep your offspring safer. Also, if you benefit personally from the group staying put instead of roaming around, it would make sense for evolution to make you “heavier”, in the sense of dragging a weight. It wouldn’t even need to be all that dramatic to buy you a few extra days in camp. I’d go out on a limb and say maybe this has something to do with humans eventually forming permanent settlements.

Also it’s clear to me that mild ADHD can be really helpful if you’re a mom. I’ve thought about more actively managing my mild ADHD for years, but as a mother of small children I’ve decided to hold off because *it’s just so damn handy to have my brain work this way right now.*

Expand full comment

Yeah, it's a stereotype, but you know what they say about those. It is *so* common when I'm hanging out with my couple friends with young kids that dad is letting the kid do [fill in the blank] while mom is yelling "slow down! Watch out! Don't let him get too close to the edge!" etc. Mothers seem to obviously perceive more danger and risk of harm than the dads do, and I'm not sure how enhanced sensitivity to potential bad things wouldn't create an increased chance of depression.

Expand full comment

In my very small bubble, I’d say the dads are actually more overprotective on the playground. But all the mental toggling required in childcare clearly wears me out less than it wears out my husband. His superior ability to focus makes him great at a lot of stuff, and he’s a great dad, but after an afternoon with the kids he’s usually exhausted in a way I’m not.

I think my ADHD is working in my favor here while it used to mostly work against me. When the kids get older and I go back to work, I might want to try some Ritalin.

It also fits that most people find their kids intrinsically interesting.

Expand full comment

There's a theory that the evolutionary reason for suicide is that when a person kills themselves there are more resources for their relatives. If this is true, it would make sense for a person with many fertile siblings to be more likely to commit suicide.

Consider a person with no siblings. If they commit suicide there are no siblings to benefit from the extra resources.

On the other hand if a person with many nephews and nieces commits suicide, there are many relatives that would benefit from the extra resources.

Expand full comment

Ever since I learned that Albert Einstein’s son Eduard had schizophrenia to a level that required him to be institutionalized, I sort of assumed that schizophrenia must have something to do with genius.

Also because—take it with the usual corrections for self-reporting—I consider myself to be a genius. My family tree on my father’s side is full of schizotypal personalities. I strongly suspect that being Mennonite, chased to the coldest edges of Europe and beyond because of one’s deeply held esoteric beliefs, selects for this trait.

I remember a post from you about this, Scott (found it: reviewing Surfing Uncertainty https://slatestarcodex.com/2017/09/05/book-review-surfing-uncertainty/). You talked about schizophrenia as a hyperaffective reaction disorder to predictive modelling errors, as opposed to autism which produces hypersensitivity in the models themselves. To put it another way, schizophrenia, in the predictive-modelling paradigm, is a disease that makes it difficult to ignore surprises.

If someone lives their lives constantly having their models challenged, and having their attention pulled to every little model-error, it creates a lot of pressure to build better models, right? If your models get better, you get fewer errors and your mental life becomes easier to manage. And those are the models that, if you happen to be able to communicate them well, become PhD material.

On the other hand, if your models don’t get better, either because you were unlucky, not cognitively flexible enough, or the inundation of model-errors was just too much, the result is something that looks more like schizophrenia.

It makes sense to me that cranking up the genetic dial labeled “schizophrenia” increases the risk-reward of cognition: either you will produce surprising insights and breakthroughs that greatly simplify your (and maybe others’) predictive models, or you will struggle to cope with the basic demands of life and lurch spasmodically from breakdown to breakdown. Or both.

The hypothesis, then, is that Albert Einstein had the right mix of schizophrenia and cognition-enhancing genes to use his irritation at model-errors effectively; Eduard got the irritation but not enough of the cognitive mechanisms to handle it.

Expand full comment

"I have been saying for years that I think some of the genes for some mental illnesses must have compensatory benefits. Everyone else said that was dumb, they’re mostly selected against and decrease IQ."

One obvious answer to the "if a gene is bad, it can't be selected for" argument is that a gene can be good in a heterozygous genotype and bad in a homozygous genotype. Sickle Cell being the classic example. One copy of the gene gives malaria resistance; two copies gives you a deadly blood disease.

I don't know much about exactly how these GWAS studies are done. But when they calculate the raw correlations between genes and phenotypes are they able to separate out the heterozygous and homozygous occurrences of the genes?

It seems pretty important. For example, in theory, a gene's hugely positive heterozygous effects might exactly cancel out its hugely negative homozygous effects, so that its raw statistical correlation to the trait is essentially zero. (This is akin to the man with one foot in boiling water and one foot in freezing water, who is "on average" experiencing a comfortable temperature).
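
For intuition, here is a minimal sketch of that scenario (all frequencies and effect sizes invented, chosen so the per-allele effect roughly cancels): the standard 0/1/2 allele-count regression used in most GWAS sees almost nothing, while the genotype-level means show a large heterozygote effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200_000, 0.2                       # sample size, allele frequency (made up)
alleles = rng.binomial(1, p, size=(n, 2))
count = alleles.sum(axis=1)               # 0, 1, or 2 copies of the variant

# Heterozygotes gain +1.0, homozygous carriers lose 3.0; at p = 0.2 these
# values make the additive (per-allele) effect nearly zero on average.
effect = np.where(count == 1, 1.0, np.where(count == 2, -3.0, 0.0))
y = effect + rng.normal(size=n)

# Standard additive coding: slope close to zero
beta_add = np.cov(count, y)[0, 1] / np.var(count, ddof=1)
print("additive beta:", round(beta_add, 3))

# Genotype-level means reveal the heterozygote advantage
for g in (0, 1, 2):
    print(g, "copies, mean trait:", round(y[count == g].mean(), 2))
```

So whether a GWAS can even see this kind of effect depends on whether it fits genotype categories or only the additive allele-count term.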

Expand full comment

IQ is a polygenic trait though.

Expand full comment

Sure. But that doesn't mean that any of the specific genes contributing to IQ can't have a different effect where the person has only one, instead of two, copies of the gene.

Expand full comment

Sickle cell is protective against a very high-fitness-cost disease whose prevalence increased relatively recently when forests got cleared, creating lots of standing water for mosquitos to breed in. That's unusual. Most deleterious genes don't have those kinds of benefits (though there are genes causing kidney problems in Africans that also protect against sleeping sickness spread by tsetse flies).

Expand full comment

That's actually my point. It is "unusual" for a gene that causes disease to persist in the gene pool without being selected out. And that is precisely why the persistence of a disease-causing gene implies that it must also have a separate countervailing positive effect within the overall gene pool.

Thus, if genes for mental illness have not been selected out of the gene pool by now, we can infer that those same genes must also carry some selection advantage. And the difference between heterozygous and homozygous gene expression is a clear mechanism to explain how the same gene can be alternately deleterious or beneficial.

The exact reason a gene is deleterious or beneficial in a given environment (e.g., cleared forests or whatever) is irrelevant. The prevalence of the sometimes-beneficial/sometimes-deleterious gene will just work itself out through the natural selection process.

Expand full comment

This isn't one single gene of large effect being more common than one would expect via de novo mutations. These are lots of genes of small effect, which is explainable via mutational load. Natural selection keeps purging them, but more pop up.

Expand full comment

Your theory is that mental illness genes are just random negative mutations that coincidentally correlate with higher IQ. That's possible. But that's the exact argument Scott disagrees with, and I think he is probably right.

Like I said in my original post, your hypothesis can be tested by separating out the effects of the genes in their homozygous and heterozygous expressions. My point is that someone should do this.

Expand full comment

Mental illness genes don't correlate with higher IQ. The notable finding is that there are genes correlated with educational attainment but NOT IQ, and those tend to be correlated with mental illness.

Expand full comment

I feel like describing this as "intelligence" vs educational attainment may be misleading, since it's actually measuring "IQ test scores" vs educational attainment as I understand it. Not to get into the whole general IQ debate, but to the extent that IQ is useful as a large-scale proxy for intelligence, that becomes less meaningful when you are comparing it to another proxy for intelligence. Given the nature of IQ tests, that makes some of the correlations less surprising.

Expand full comment

The correlation between educational outcomes and cognitive performance (Cog) isn't surprising. What they did was look at the genetic variation in educational outcomes not explained by genetic variation in cognitive ability (NonCog). This left-over NonCog component is actually pretty interesting.

Expand full comment

Re: Math PhDs and autism, I am assuming it was written as a joke, but when I read it I was initially like “sounds about right” and then I thought about all the math grad students and professors I know and realized that none of them seem actually autistic. There is definitely something atypical about most math people I know, but it seems varied and generally not autism.

Do people know stats on this? It might also be because I mainly see academics, and teaching feels like something that would select against autism, so maybe there are more autistic math PhDs out in industry?

Expand full comment

I remember reading that yes, mathematicians do get positive autism tests at higher rates than normal population. But there was also discussion about the validity of the testing for certain subpopulations. In particular, when testing autism it's usually done with some autism spectrum index. The index is a construct: you check all boxes that apply and the doctor says "you have autism". Of course the idea is that the construct correlates well with Actual Autism, but the index is calibrated on the general population and not on highly selective subpopulations such as math PhDs. I think math PhDs are weird people in a lot of ways, and coincidentally some of those weirdnesses overlap with the general population autism spectrum index.

Expand full comment

My anecdotal experience as a Physics grad student was that autism-like traits (including a couple of very obvious cases) were more common than gen pop, but that still translates into a low prevalence in absolute terms - it's not that surprising to find a department without any autistic people?

Expand full comment

As a chemistry PhD I can say that being on the Autism SPECTRUM is certainly correlated with the math intensity of STEM faculty....

Expand full comment

There is a theory that the evolutionary reason for suicide is that when a person kills themselves there are more resources (like food) for their relatives. So depression might exist to cause suicide.

I have the theory that this is right, except that the gene for suicide is not in the suicidal person but in the mother of the suicidal person. That is, it is not a gene for being suicidal, but a gene for making your child suicidal. The child is affected while it is in the mother's womb.

The advantage of my theory over the older theory is that the evolutionary drawback of the suicide is halved, since the person committing suicide shares only half of the mother's genes, while the evolutionary benefit of the suicide stays the same.

I made a comic about it: http://evolutions.thecomicseries.com/

Expand full comment

We need more on verbal tilt + psychopathology + belief system, digit ratio + GNC + bone structures + Life History, or blood type + hyperopia + intelligence + SES.

Expand full comment

Very nice post. A note on the schizophrenia result and false discovery. In figure 4 they are graphing 95% confidence intervals (roughly plus or minus two standard errors), and if you look at the confidence intervals for schizophrenia they are very far apart. If you tripled the size of the confidence intervals (roughly plus or minus six standard errors) they still would not overlap. Assuming everything is normally distributed (which they have already done; their confidence intervals rely on this), we can get a loose bound: the chance of seeing a separation this large as a pure false positive is less than about 2 in a billion. (This number should not be taken literally, because the approximation of the data by the normal distribution will not have this level of accuracy, and there are other possible errors in the study. However, it accurately expresses that seeing an effect of this size simply due to a false positive from running many comparisons is exceedingly unlikely.)

In the paper they also list a P_{diff_fdr}<.001, which I think is supposed to be the P value of the schizophrenia result after taking into account the risk of false discovery. I include the simple and loose analysis above simply to demonstrate that even if I am misinterpreting their P value, from the visual of the confidence intervals alone, false discovery seems very unlikely here.
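
A quick way to sanity-check that loose bound (this just reproduces the back-of-envelope arithmetic above, assuming normality and treating the separation as six standard errors):

```python
from scipy.stats import norm

z = 6.0                          # separation in standard-error units (assumed)
p_two_sided = 2 * norm.sf(z)     # P(|Z| > 6) under the null of no difference
print(p_two_sided)               # ~2e-9, i.e. roughly 2 in a billion
```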

Expand full comment

There is a theory that schizophrenia is a side effect of human self domestication. Which of course means you'd need to buy into the self domestication hypothesis first but it would explain how we got schizophrenia.

Expand full comment

Why would it be a side effect of human self-domestication?

Expand full comment

Basically, that it's a side effect of selecting for self domestication. This presents the basic theory - https://www.psychologytoday.com/us/blog/the-imprinted-brain/201609/schizophrenics-hyper-domesticated-humans. Not sure I buy it but I found it interesting.

Expand full comment

"... whether you so desperately seek societal approval that you're willing to throw away your entire twenties on a PhD with no job prospects at the end of it."

Ouch, man. That hurts.

Expand full comment

The payoff is in gaining matches on Tinder. PHD? 80% more matches, for a guy

Expand full comment

I wish this stereotype that PhDs have no job prospects would die or at least get toned down a bit. There are plenty of scientific fields with good job prospects for PhDs outside as well as inside of academia (maybe not straight biology, though).

Expand full comment

Science (or STEM more generally) is the exception, I think, and even there I don't think a PhD is helping you compared to a Masters, given how many extra years it takes

Expand full comment

Maybe in computer science. In the sciences, master's degrees are usually worthless. Some companies may use it as a qualification for a lower-tier job than what you get with a PhD, and mayyybe there's potential for upward mobility, but I don't think it's that common.

Expand full comment

Generalizing quite a bit:

S - Academia requires a Ph.D for any of the good jobs (tenure-track faculty or researcher at one of the non-university research institutes), but it produces at least five times as many Ph.D.s as there are such positions to fill. *If* your brand of Science has a strong industrial component (Chemistry yes, Astronomy no), then a Ph.D. is likely to get you a decent job.

T - Lots of industry jobs, but they're almost all open to anyone with an MS - if you put in the 3+ extra years for the doctorate, slight chance you get to be e.g. a Comp Sci professor, more likely you're just getting societal approval and/or personal fulfillment.

E- See T

M - There are jobs for which the Ph.D. opens the door, but aside from the occasional math-professor gig they're mostly for e.g. three-letter agencies or financial firms, which is probably not what you had in mind when you decided to become a mathematician.

Expand full comment

Social Approval is a non-cog and verbal tilt trait, and de-coupled from intelligence.

Expand full comment

Just came to the comments to see whether anyone reacted to this. Is this sentence a joke or does Scott really think that about getting a PhD in general? No job prospects at the end of PhD, what?

Expand full comment

The lack of job prospects is certainly true for humanities PhD programs, which I was in until I had the good sense to drop out. Hence my “ouch”.

Expand full comment

If functional mental disorders are primarily due to evolutionary mismatch, then there is no need to explain the benefits of "mental illness genes." While some people are genetically more prone to adult onset diabetes than are others, in the environment of evolutionary adaptation such diabetes was rare. We don't try to explain the benefits of diabetes genes. Likewise, while some people are more genetically prone to functional mental disorders than are others, in the environment of evolutionary adaptation it seems likely that such disorders were likewise rare.

Durkheim's research on the higher rates of suicide among Protestants than Catholics was the beginning of a long tradition of empirical research relating to the phenomenon of anomie, as opposed to cultural embeddedness with strong social roles and bonds, leading to adverse mental health outcomes. Here is a recent study (n = 8446) on how increased exposure to US culture increased suicide attempts among youth in the Dominican Republic,

"The increases in the propensity to attempt suicide for DR youths across these US cultural involvement indicators were both robust and large. For example, the propensity to attempt suicide ranged from 6.3% for those at the lowest end of the range of use of US electronic media and language to 13.3% for those at the highest end of the range of use of US electronic media and language. This central finding is congruent with the lower suicide or suicide attempt rates found for first-generation or less acculturated Latinos across multiple national and regional cohorts of Latinos."

Liah Greenfeld's "Mind, Modernity, and Madness" provides a neo-Durkheimian account that provides a coherent explanation for how increasing levels of anomie in modernity lead to increased rates of depression, bipolar, and schizophrenia. In traditional cultures, with humanity in the environment of evolutionary adaptation being the most "traditional" societies, there was neither need nor opportunity to construct a personal identity. A human being was one's roles. There was no "I" in the modern sense (cf. Julian Jaynes). One was unaware of the water in which one was swimming. Now we are all fish out of water, flopping around, gills desperately sucking in air in an attempt to maintain mental stability. For some of us it is easy, for others very difficult. The genetic material of fish works just fine in the water.

While Scott is not sympathetic to this explanation, I've never seen compelling evidence that shows that functional mental disorders were not due to evolutionary mismatch. Yes, whether functional mental disorders have increased over time and across cultures remains contested. Depending on one's priors, the burden of proof shifts.

But studies of suicide provide less contested evidence that culture is a major influence on suicide rates. As far as I know, all such comparative studies are consistent with greater anomie, greater burden on constructing one's own identity (as opposed to a relative lack of the need to create an identity in more traditional cultures) resulting in higher rates of suicide.

Human beings evolved over many millions of years in diverse physical environments. But with respect to social structure, until the dawn of agriculture and empire, almost all adolescents:

1. Lived in a small tribal community of a few dozen to a few hundred with few interactions with other tribal groups.

2. These tribes would have shared one language, one belief system, one set of norms, one morality, and more generally a social and cultural homogeneity that is unimaginable for us today.

3. They would have been immersed in a community with a full range of ages present, from child to elder.

4. From childhood they would have been engaged in the work of the community, typically hunting and gathering, with full adult responsibilities typically being associated with puberty.

5. Their mating and status competitions would have mostly been within their tribe or occasionally with nearby groups, most of which would have been highly similar to themselves.

Could the dramatic divergence from the environment of evolutionary adaptation in any or all of these socio-cultural features result in increased "mental illness" for a genetic subset of human populations?

https://flowidealism.medium.com/evolutionary-mismatch-as-a-causal-factor-in-adolescent-dysfunction-and-mental-illness-d235cc85584

Expand full comment

" I've never seen compelling evidence that shows that functional mental disorders were not due to evolutionary mismatch."

It seems to me an unlikely general explanation, although it certainly is part of the explanation for, for example, the high level of depression in modern societies.

But it seems to me quite unlikely that severe enough autism or schizophrenia either would not appear or would not be deleterious in the environment of evolutionary adaptation.

Expand full comment

"Their genetic measure of non-cognitive skills... was still correlated at r = 0.31 with IQ"

Note that this is also a *genetic* correlation - the genetic influences of NonCog correlate at r = 0.31 with the genetic influences of IQ, not with actual measured IQ. The same is true for the Cog/NonCog relationship you mention with self-reported math ability and highest math class taken. (Also, assortative mating would inflate these correlations.)

Expand full comment

Oh, that is a very good point. So these are basically "brain is broken, makes you both stupid and lazy/whatever" genes, I suppose.

Expand full comment

Delay discounting is worse than alcoholism? I interpret this not as meaning that delay discounting is bad for success, but that high educational attainment is bad for success. Huge delay discounting = not going to college because college takes a long time (not obviously wrong); no delay discounting = being willing to go to college for 12 years to get a marginally nicer job (obviously wrong).

Expand full comment

The way they use correlation assumes that all personality traits have linear effects on whatever they're measuring. If the function relating {number of genes "for" a behavior} to {educational attainment or IQ} has a tall U-shaped (or upside-down-U-shaped) curve, as some might, the correlation results will depend mainly on the outliers. For example, if a little conscientiousness helps you finish college, but a whole lot makes you such a perfectionist that you're likely to fail college, then the "correlation" between conscientiousness and finishing college isn't telling you how strong the effect of conscientiousness on finishing college is; it's telling you something about the skew of the U-shaped function.
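
A minimal simulation of that point (the functional form and all numbers are invented, purely for illustration): an inverted-U link between a trait and graduation probability yields a single Pearson correlation that mostly reflects where the peak of the curve sits relative to the population, not "how much the trait matters."

```python
import numpy as np

rng = np.random.default_rng(2)
consc = rng.normal(size=100_000)                 # standardized trait (made up)
# Inverted-U: peak benefit near +0.5 SD, penalty further out in either direction
p_grad = 1 / (1 + np.exp(-(1.0 - (consc - 0.5) ** 2)))
graduated = rng.random(100_000) < p_grad

# One number hides the non-monotonic shape
print("Pearson r:", round(np.corrcoef(consc, graduated)[0, 1], 3))

# Binned graduation rates show the rise-then-fall pattern
for lo, hi in [(-3, -1), (-1, 0), (0, 1), (1, 2), (2, 3)]:
    m = (consc >= lo) & (consc < hi)
    print(f"{lo} to {hi} SD: grad rate {graduated[m].mean():.2f}")
```

The r comes out positive here only because the peak sits above the population mean; the binned means show that past the peak, more of the trait hurts.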

Expand full comment

For some reason, when people talk about genetics they like to forget that correlation is not causation.

Scott, you are usually vigilant about distrusting "correlation but we adjusted for confounders" studies. This is the same type of study!

What are "genes for intelligence"? In this context, they are genes that are correlated with intelligence. A gene that purely causes black skin (but does literally nothing else) would be counted as if it decreased IQ, because statistically people with black skin do worse on IQ tests.

There is no end to the number of possible confounders here; there is no exogenous source of randomness at all.
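
Here is a minimal simulation of exactly that confound (all numbers invented): a variant with zero causal effect on the trait looks "associated" in the pooled sample simply because two groups differ in both allele frequency and environmentally driven trait mean.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
group = rng.integers(0, 2, n)                    # two subpopulations
freq = np.where(group == 0, 0.1, 0.6)            # different allele frequencies
allele_count = rng.binomial(2, freq)             # neutral variant, no causal effect
trait = rng.normal(loc=np.where(group == 0, 100, 90), scale=15, size=n)

print("pooled r:", round(np.corrcoef(allele_count, trait)[0, 1], 3))   # clearly nonzero
for g in (0, 1):
    m = group == g
    print(f"within group {g}:",
          round(np.corrcoef(allele_count[m], trait[m])[0, 1], 3))      # ~0
```

Real GWAS try to handle this with ancestry principal components and within-family designs, but the commenter's point stands: there is no guarantee those corrections are complete.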

Expand full comment

In general: I've often thought that a lot of people go to college and get phds just because they are familiar and comfortable with school and scared or unsure about entering the 'real world' (certainly true for me). I've often thought that a lot of the apparently high levels of mental weirdness in phd programs are largely related to this - who wants to effectively extend their childhood by staying in school, vs who is ready to 'grow up' and enter the real world.

Expand full comment

FWIW, I know several people who got Ph.D.s in order to be able to pursue their favorite research. Of course, arguably, the desire to understand esoteric properties of nature is also more childish than the desire to settle down with a family and 2.5 children...

Expand full comment

I have some experience living with people diagnosed as schizophrenics. I put it that way because I'm not sure how well understood schizophrenia is and how confident we can be in a clinical diagnosis.

Other than living with diagnosed schizophrenics, I know very little about the condition. My experience is that schizophrenics live in a fantasy world. They are unable to tell the difference between the real world of experience and the world inside their head. I know one in particular who can talk at length of his day to day life in Vietnam, when in fact he has never been there. He appears to be unable to distinguish between fantasy and reality.

On the other hand, my impression of highly successful scientists, engineers, mathematicians, linguists, etc... is they can build mental maps that enable them to navigate their special practices. It seems to me they comprehend very complicated systems, such as molecular biology or microelectronics using some type of mapping onto real world experiences.

What I mean to say is what schizophrenics and mathematicians, e.g., have in common is the ability to live in an alternate reality. I would say visualizing the complex folding of a protein molecule is not too far removed from imagining that you are in fact Sgt. Barry Sadler in 1969.

Expand full comment

As a high-IQ person who withdrew from college for mental health reasons, this is really interesting personally

Expand full comment

> you can't make a six digit number of people sit down and take IQ tests.

Are you kidding? This happens multiple times a year.

Expand full comment

> you can't make a six digit number of people sit down and take IQ tests.

Actually: https://biobank.ndph.ox.ac.uk/ukb/field.cgi?id=20016

But in general, you're right.

It's amazing how dominant UK Biobank is in genetics right now (at least behavioural genetics). I was at the IGSS conference (https://cupc.colorado.edu/conferences/IGSS_2021/) and more than half the presentations were using that data.

Expand full comment

+1 I'm sure there are better ways of measuring intelligence out there, but they take more than 45 minutes per subject and a scantron.

Expand full comment

ICAR16. 15 minutes and done.

Expand full comment

Overall, I'd be worried about confounds here. We have a very noisy measure of "genes for IQ" - noisy because GWAS is noisy, and because the IQ measure itself is noisy (just a quick 12 point test IIRC). Then we deduct that from "genes for educational attainment". What's left? Maybe "genes for non-cognitive skills". But maybe "genes for IQ, that we didn't measure very well". Indeed, "non-cognitive PGS" predicts IQ.... And then there are all the possible environmental confounds. I think I'd rather see a measure of non-cognitive skills and then a GWAS that targets that directly.

However, that is just a lazy first take, and I should stop shooting my mouth off about a coauthor's paper.
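
For readers who want intuition about the "deduct IQ genes from EA genes" step being questioned above, here is a deliberately naive cartoon. This is not the paper's actual method (which uses Genomic SEM / GWAS-by-subtraction on summary statistics and models measurement error); every number below is invented, and the point is simply how a noisily measured Cog component leaks into the "NonCog" residual.

```python
import numpy as np

rng = np.random.default_rng(4)
n_snps = 10_000
beta_cog = rng.normal(0, 0.010, n_snps)            # true SNP effects via cognition
beta_other = rng.normal(0, 0.010, n_snps)          # true effects via other routes
beta_ea = beta_cog + beta_other                    # effects on educational attainment

# The cognitive GWAS is noisy, so only a degraded version of beta_cog is observed
beta_cog_obs = beta_cog + rng.normal(0, 0.010, n_snps)

# Naive "subtraction": residualize EA effects on the observed cognitive effects
slope = np.cov(beta_cog_obs, beta_ea)[0, 1] / np.var(beta_cog_obs, ddof=1)
beta_noncog_hat = beta_ea - slope * beta_cog_obs

print(round(np.corrcoef(beta_noncog_hat, beta_other)[0, 1], 2))  # recovers much of "other"
print(round(np.corrcoef(beta_noncog_hat, beta_cog)[0, 1], 2))    # but still carries true Cog signal
```

In this toy setup the residual still correlates substantially with the true cognitive effects, which is exactly the "NonCog might partly be Cog we didn't measure very well" worry raised above.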

Expand full comment

Counter proposition: EA genes are a superset of IQ genes, so social skills and creative skills would be part of this EA-IQ set. But then we would see Psychosis tilt (lack of autism), Extraversion, Agreeableness, Conscientiousness, Longevity, Feminization (older mothers, tied to GNC) being part of a bundle?

Expand full comment

Not sure if you care, but autism and ADHD are not considered "mental illnesses".

Expand full comment

If it's in the DSM, doesn't that mean it's a mental illness? ...

Therefore, getting your condition voted out of the DSM means that you were "cured"?

Expand full comment

If schizophrenia genes increase education, that would also be fitness reducing.

Expand full comment

There's a lot of inference to worry about here, but I'm already stuck on this, from the paper: "By construction, NonCog genetic variance was independent of Cog genetic variance (r_g = 0)." What sense of "independence" follows from zero covariance? That's pretty clearly going to create some weird conditional relationships between their SNPs' imputed Cog and NonCog scores to maintain zero correlation. It seems they recognize this in the supplement but they don't really bother to interpret it.
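
A minimal numerical reminder of why that matters (a toy example, nothing to do with the actual SNP data): zero correlation is much weaker than independence, and structure can reappear as soon as you condition.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=100_000)
y = x ** 2                                                # fully determined by x

print(round(np.corrcoef(x, y)[0, 1], 3))                  # ~0.0 overall
print(round(np.corrcoef(x[x > 0], y[x > 0])[0, 1], 3))    # strongly correlated once conditioned
```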

Expand full comment

I ask this because I follow Charles Murray on Twitter in order to argue with his position that there is a race-IQ causation - has anyone dared to check whether these cognitive genes vary across ethnicities?

Seems like this database would provide quite strong empirical evidence on the unfortunate topic

Expand full comment

There have already been something like three admixture analyses done... even controlling for skin color and income you get the same result: more European ancestry, better results, and vice versa. Truly horrifying topic to be honest, and I was really disturbed getting into it myself. Hoping to God there's a way out of this mess.

Expand full comment

Interesting, does that apply relative to Eurasian and/or East Asian ancestry as well? It wouldn't surprise me if there was something we don't know about the correlations that gets figured out eventually. 'My' theory (Thomas Sowell's theory) has been that historical access to the large east-west landmass of Eurasia and things like density of ports, fertile plains and navigable waterways is the main proximate cause for culturally driven differential IQ results

I understand Murray does not believe that a genetic effect would mean anything about the value of an individual, and certainly not of groups, but I fear the social effect of a widespread belief, whether reality-based or not, in population genetic differences. The comments his tweets on the subject attract suggest he isn't right to think there's nothing to be concerned about.

Expand full comment

Am I reading the chart correctly to note that SNPs associated with higher cognition are *negatively correlated* with conscientiousness (and extraversion, and agreeableness)? That's absolutely fascinating given that educational attainment (and a whole bunch of other traits associated with success, like wealth and income) are so strongly associated with conscientiousness, and suggests at least to me that most of the difference between educational attainment-promoting and cognition-promoting genes should logically have something to do with the trait.

It also seems like a fascinating counterexample to the idea you see brought up often in population genetics that "all good things are correlated". We see a discussion of a g factor, which includes all of cognition, often, and it seems frequently mooted that g itself is part of h, a general health factor. (Also, while I would certainly associate high-cognition-but-low-educational-attainment individuals with low conscientiousness, it boggles the mind to suggest that low conscientiousness, by itself, is associated with higher cognition. Why should that be true? Where's the tradeoff?)

Assuming I am reading the graph correctly, and blue is just 'high cognition', and not something like 'SNPs promoting high cognition but not educational attainment'. I could go back and try to read the chart more carefully, or read the article the chart comes from, but I think I'm too low conscientiousness to go and do so. One read-through is enough.

Expand full comment

I've never been tested for ADHD and I don't know if I'll ever bother as my country has a dismal record of treating it, but I strongly suspect I have it. In any case I am extremely hyperactive and have been for a long time.

My life was basically total chaos up to a point. I finished university with a very mediocre result (around 2.4 I think). I aced courses that were interesting to me, and I failed Statistics 101 TWO times until I got angry the third time and got a B+.

Then I was lucky enough to find a technique that works for me - a sort of a Zen meditation where you just sit without moving for a long time. It doesn't remove my ADHD symptoms, but it calms me down a lot and allows me to focus and work.

I am a programmer now and I think there are some advantages, but only because I am treating it in some way - otherwise it would just be a total burden.

The advantages:

- I am better than others in scanning a large amount of code in a search for some obscure problem. In more general sense, I am just better in scanning in general - if I have some list in front of me and I need to find something, I do it faster than others.

- When something gets interesting to me, I feel like my mind gets totally obsessed and get actually angry if someone distracts me - I have shouted at people for that. This might sound bad, but allows me to do fast analysis of a large amount of data.

Disadvantages:

- I have a problem when I need to slow down and focus on one thing. I just cannot motivate myself to study, I have tried for years. The only way I learn new information is when I actively write code, because it is interesting and not boring.

- If I start procrastinating even a little it is very possible that my whole day will be spent in Reddit and YouTube - basically when my brain gets interested in some bullshit many hours can pass before I can stop myself.

- I work well when my work is interesting and badly when it is not. For example, it is interesting to write some new feature, less interesting to make a strategy to test it, so that might take much more time than it has to.

Hope this was interesting if not useful.

Expand full comment

Resonates with me but... it's kind of worrying? I have the strengths you mentioned, plus the weaknesses, and I was hoping there'd be a way to move beyond just leaning into my strengths. I suppose I could marry a lady who has the opposite of ADD, whatever that is... haha

Oh man, it really just kind of is this way. Keep searching for productive, interesting things to work on that have a lot of payback.

Expand full comment

Can someone explain what "PGS analysis" means in the second graph? Maybe this is a dumb question but a Google search didn't help me at all.

Expand full comment

PolyGenic Score. It is your predicted phenotype from your genotype according to a genome-wide association study. Because the PGS over the whole sample has an R^2 of something like 0.05 to 0.3, it is a very noisy predictor of phenotype.

Sometimes people use PGRS, where the ‘R’ is for risk, because GWAS was first used for diseases.
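
For anyone who wants the mechanics spelled out: a polygenic score is conceptually just a weighted sum of allele counts, with weights taken from GWAS effect-size estimates. A minimal sketch with entirely invented genotypes and weights:

```python
import numpy as np

rng = np.random.default_rng(6)
n_people, n_snps = 1_000, 5_000
genotypes = rng.binomial(2, 0.3, size=(n_people, n_snps))  # allele counts 0/1/2 (made up)
gwas_beta = rng.normal(0, 0.01, size=n_snps)               # per-allele effect estimates (made up)

pgs = genotypes @ gwas_beta                 # one score per person
pgs = (pgs - pgs.mean()) / pgs.std()        # typically standardized before use
print(pgs[:5])
```

With an R^2 in the 0.05 to 0.3 range, as noted above, most person-to-person variance in the phenotype is still unexplained by the score.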

Expand full comment

Hypothesis: schizophrenics have a better connection to the supernatural, which is actually real. They're the most misunderstood mentally ill people, except maybe cluster B folks. I'm sorry to admit this hypothesis is more than a little bit inspired by stereotypes about chosen people.

Expand full comment

There's a history of schizophrenia in my family. There's also a history of genius, creativity and outrageous financial success. The money is nice. But I miss my brother.

Expand full comment

There is now genetic and epidemiological evidence that autism and mental illness genes overlap (especially with ADHD and Parkinson's). But this is not being communicated to the public. How long has this been covered up? I can't believe that in almost 100 years of autism research this has not been noticed or investigated until now.

Expand full comment

Kirkegaard: "Told you about Verbal Tilt yo!" (/s)

But notice how verbal vs math is similar to non-cog vs cog or verbal tilt vs IQ, that could be elaborated upon?

Expand full comment