Pretty much everything in the body is the same. Take out 100% of X, and you die horribly, whether it's a vitamin or a microelement. This doesn't mean that having extra X is a good thing - sometimes it's harmless, sometimes it kills you just as horribly (you should have seen the look on my doctor's face when she saw my potassium levels).
What we're doing here is looking for the very few X-es where supplementing does a net good with low risk. That's a more complex issue, because it's basically a question of environment and luck. If X is too low, why? Humans have evolved to thrive on close to the most varied diets in the animal kingdom, so how come we don't get enough of that specific "X"? The reason put forward for vitamin D was light exposure, which is part of the reason it's a lot more studied than something random like cobalt. But we'd have expected to find something by now.
> I’m not sure why they flip-flop between “lower doses are better” and “lower doses are the same”,
I am not a scientist and I don't read many studies, but it's weird to me how many times I read people like Scott talking about studies and see these sorts of incongruities in the text of the study.
As someone who writes scientific papers, even with multiple co-authors going over a manuscript and sending it out to colleagues for pre-publication review, and peer-review during the publication process, it is still shockingly easy to both make these errors and fail to catch them.
My best-guess explanation is that for small, phrase-level errors (small as in length in the text not in magnitude) like this, when you are familiar with the content and know what it _should_ be, your brain is very very good at skimming it and filling in what it's supposed to be.
As authors we know this (usually learned through experience) and try to catch these mistakes, but in the process of writing, it's likely that dozens get made at some point in one draft or another so the fact that a relatively large number of papers have 1 or 2 make it into the final publication draft does not surprise me that much, as unfortunate as it is.
Seconded. Of course we try to be as careful as possible, and I personally proofread every letter of a paper at least twice just before publication, but as the saying goes: "The total number of errors in your paper is at least N+1, where N is the number of errors you found in proofreading."
My understanding regarding covid was more that there was a correlation between higher vitamin D levels and faring better with the virus, including long covid. Which leads to some experimentation with giving vitamin D as treatment, but really levels need to be up before then. Also, just anecdotally, my friends and I who have thyroid issues have all had vitamin D levels below 30 ng/mL (which is what one study mentioned as a threshold back in 2020) and have all been advised by our doctors to take D3 w/K as a daily supplement.
And just another thought is that nothing about our body is boring. Underestimating the mystery we are still exploring is definitely going to lead to us missing something.
For every chemical compound we know about, there are probably a dozen related compounds we don't know about. And for each one of these, there's probably a dozen or so positive effects of increasing the dosage, and another two dozen or so negative effects of increasing the dosage. Medicines IRL are not like video-game meds where more = better.
Scott is making two classic mistakes: first, binary thinking; second, first-level thinking.
Outside of perhaps polonium, any chemical is neither good nor bad; there are multiple tradeoffs to dosages. Chemicals interact with other chemicals, chemical receptors interact with different chemicals, cellular chemical interactions interact with other cellular chemical interactions. It's way complex, too complex to even imagine.
Scott isn't making those mistakes. And despite your last paragraph, we can still make blanket statements about the effect of ingesting certain things by doing well designed studies. Just because something is complex doesn't mean we can't know anything about it.
You're not really making any arguments here - just giving us more anecdotes which are totally useless. If you could produce a study that showed, for people with less than 30 ng/mL, taking the dose you're taking daily would lead to a longer or better quality life in real people, then we have something to discuss. No such study exists, and your anecdotes about having a prescription from your doctor prove nothing.
This works both for and against Vitamin D being relevant to COVID. It could be sunlight generally, warm weather, being outside, Vitamin D, some other sunlight/outdoor/warm weather-related chemical, or a combination of any of these. Or it could be a coincidence, or related to some third factor that affects both time spent in the sun and COVID - for instance the fact that healthy people are more likely to be outside, and old people are least likely to spend lots of time outside and also most likely to get sick and die.
Vitamin D is considered a likely culprit because it's an identified chemical that our bodies seem to use for good effect, but it's hardly the only possibility.
Vitamin D levels, since they are linked to sun exposure, get correlated with all sorts of things. Multiple Sclerosis is relatively more common in the Pacific Northwest and one theory is that it has something to do with a lack of vitamin D since the area is often cloudy and rainy.
If you move your weekly pinochle game outdoors, odds are you will be less likely to get COVID if one of the other players is infected. Is it the sunlight and increased vitamin D? Is it the better air circulation than in that stuffy back room? Is it the exercise from chasing the cards when a gust blasts them off the table?
Again, Vitamin K supplements and excessive usage should be medically evaluated, and I am not a big fan of running to the doctor... at all... having worked in medicine. D and C and Zinc are good immune boosters, and so is natural sunshine. :) yep.
> If you want to go higher in that range, you can trade off a tiny mostly-theoretical risk of a very mild insufficiency for a tiny and mostly-theoretical risk of a very mild toxicity.
Is there any research about what that risk of toxicity could be? I assume that Hoffman is currently taking huge doses of vitamin D to match his assumptions about ancestral populations. Is there a specific reason to believe that he is in danger here (as opposed to generic "too much of anything can be dangerous" reasons)?
I looked into this a while ago; one of the studies I found looking at case reports of observed toxicity was from https://doi.org/10.1111/cen.12836 (available on sci-hub), whose patients had been given between 30,000 and 200,000 IU per day for at least a month. I haven't found any mention of toxicity from daily doses under 20,000 IU in the literature, but definitely could have missed something.
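For scale, here's a rough back-of-envelope comparison of those case-report doses with a typical over-the-counter tablet (the 4,000 IU tablet size is my assumption, not something from the case reports):

```python
# Rough scale comparison: case-report toxicity doses vs. a typical OTC supplement.
# 30,000-200,000 IU/day are the case-report doses cited above; the 4,000 IU
# tablet is an assumed "typical" over-the-counter size, not from the paper.
typical_tablet_iu = 4_000
toxicity_low_iu, toxicity_high_iu = 30_000, 200_000

print(f"Low end of reported toxic doses:  ~{toxicity_low_iu / typical_tablet_iu:.1f} tablets/day")   # ~7.5
print(f"High end of reported toxic doses: ~{toxicity_high_iu / typical_tablet_iu:.0f} tablets/day")  # ~50
```

So even the lowest dose in those case reports amounts to several standard tablets every day, sustained for a month or more.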
In discussions of toxicity, I would think some consideration might be given to the propensity of high levels of vitamin D to calcify soft tissue. K2 sufficiency may be a relevant factor in soft tissue calcification.
"5.1. Hypervitaminosis D and VC
Induction of calcification through hypervitaminosis with vitamin D has been demonstrated and well characterised in multiple animal models, including mice, rats, goats and pigs (see Table 1). Treatment of rats with sublethal doses (7.5 mg/kg) of vitamin D plus nicotine produces a lasting 10–40 fold increase in aortic calcium content, resulting in the calcification and destruction of medial elastic fibres, subsequently leading to arterial stiffness [84]. In goats and pigs, dietary supplementation of vitamin D promotes the development of aortic and coronary calcified lesions in association with elevated serum calcium and cholesterol levels [85,86]. Vitamin D induced calcification in mice is currently considered to be one of the more robust models of calcification, in which single doses of 500,000 IU/kg/day can produce severe aortic medial calcification after just 7 days following 3 consecutive days of initial treatment [87]. Interestingly, a recent study produced a variant of this model in which mice initially treated with a lower dose (100,000 IU/kg/day) for 7 consecutive days developed moderate aortic calcification outcomes at 28 days (unpublished). In addition to precursor forms of vitamin D, such as D2 and D3, dosing of its active metabolite, calcitriol, also produces diffuse and widespread soft tissue calcification that has been demonstrated in a time-dependent manner in rats [88]. Despite the number of in vivo models, evidence to explain the clear mechanisms of action by which excess exogenous vitamin D promotes calcification is still lacking."
Toxicology studies are often like that. They try to substitute high doses for long time frames. Yes, it has issues. But so does trying to run an experiment for 40 years.
I would argue that this approach, substituting high doses for long time frames, is inherently useless. I can't see any way to avoid producing blood concentrations higher than you EVER get in the real world, assuming the human body can reliably use what it needs, dispose of some amount, and only store what's left after that point if anything. The fact that the alternative can't be done doesn't make this option correct. If you have no good options, the least bad option doesn't magically become good enough to rely on.
Bouncing off that, a summary of my post: seasonality, vaccination status, and hey, what about that 3x long covid risk with Vit D in the study?
One thing is that Vitamin D levels are seasonal, and the seasonality of studies should be considered in these analyses. E.g. the RR 0.23 study ran from June through Dec 2020, vs. CORONAVIT on ARI prevention, which ran from Dec through June 2021 (unclear: October is mentioned in the text for evaluating participants for inclusion, and the first month of follow-up was December, so it started in November?). If Vit D takes a while to ramp up, and it's ramping up when cases are highest, that's going to reduce the effect. Measuring vitamin D at the end of the study, *during the summer*, to compare groups: I did not understand that. I know studies don't have infinite budgets, but measuring vitamin D levels is cheap, doable by post it seems, and measuring them at onset of illness or at monthly follow-up would really help understand everything going on.
Vaccination, especially right after it, has a very strong effect on COVID-19 infection and is a serious potential confounder. Table S3 reports infections and long covid by vaccination status *at the end of the study*, not at time of infection!? When vaccination happened during the study is important but not analysed. Frankly, the "participants looked the same % in June" in the appendix IMO was not sufficient; when vaccination happened affects time with exposure. Yeah yeah, it's supposed to be an RCT, but most RCTs are not designed to run concomitantly with the biggest intervention known to work.
I don't get why no one is talking about the frankly scary 3-4x RR for long covid for those on vit D in the study. Yeah it is noisy and not powered for that outcome but that is scary. I am guessing there is a non-linear effect for too high vitamin D, sadly particularly among women (who suffer from auto-immune disease more). LC risk also potentially confounded again by vaccination status at time of infection.
In general with vitamin D and with anything with threshold/sufficiency/non-linear effects, things are really complicated. Scott's post talking about averages, that is a start, but maybe all this stuff is complicated and not agreed on b/c people have trouble thinking through all that. Any time I see a vitamin D study without an analysis that is at thresholds/sufficiency as well as "on average" it is just weird to me.
That first study was only in older adults (mean age 67 ish), was barely statistically significant, will likely be unreplicated for many years, and only reduced the incidence of autoimmune disease from about 12 per 1,000 to 10 per 1,000 - doesn't seem particularly interesting or surprising.
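To put that effect size in perspective, here's the arithmetic with the rounded per-1,000 figures above (the trial's own relative estimate, cited elsewhere in this thread, is 22%, so these rounded numbers land in the same ballpark):

```python
# Absolute vs. relative effect, using the rounded figures above:
# roughly 12 vs. 10 autoimmune-disease cases per 1,000 people over ~5 years.
control_rate = 12 / 1000
treated_rate = 10 / 1000

absolute_reduction = control_rate - treated_rate        # 0.002, i.e. 0.2 percentage points
relative_reduction = absolute_reduction / control_rate  # ~0.17, i.e. ~17% relative
nnt = 1 / absolute_reduction                            # ~500 people treated ~5 years per case prevented

print(f"absolute: {absolute_reduction:.3f}, relative: {relative_reduction:.0%}, NNT: ~{nnt:.0f}")
```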
Yes, there are always ways to cut up the data to get statistical significance in some subgroup or over some time period. You must have missed the part where Scott literally says: "I accept there are Vitamin D receptors on immune cells and a couple of other things, and it probably plays some role in modulating those." It's debatable that the study justifies any positive statement about Vitamin D's ability to prevent autoimmune disease, especially in the absence of any replication or further studies, but no one here is claiming that it doesn't have any effect beyond bones.
I was at 16 ng/mL 2 weeks ago. Started supplementing at 50K IU a week for 3 months; I will check my blood again when done. Out of curiosity, if readers here were to make me a market for my Vit D level at the end of this period (even odds buy/sell levels), what would it be?
Are you taking one dose weekly? I don't really have a great citation for you, but the chemically active metabolite is two reactions downstream from the supplement chemical. I'd think there's a risk of the kidneys filtering out A such that the concentration of B eventually falls too low to keep up production of C.
There is folklore that Vitamin D provides some benefit against the winter blues (SAD). A quick Google search has some sites which say the evidence is 'mixed' for this purpose but also that some studies claim Vitamin D increases levels of serotonin in the brain which something something helps with depression.
How credible should I find this theory? I've been taking Vitamin D supplements for years and anecdotally they seem to be helpful (weeks I don't take my supplements I generally feel worse - of course, this could be correlation not causation...), but it also seems quite likely I'm just a master placebomancer.
Well, I guess my response is that I wouldn't say "Balloons help with SAD". I'd say specifically that "balloons help generate a placebo effect that help with SAD". This seems to more accurately represent reality and what people are talking about when they use these words.
I'm not sure. Seasonal Affective Disorder is definitively linked to light input (we know that because really intense light supplementation is an effective therapy.) That automatically puts the prior on Vitamin D being linked to it notably higher than a random chemical.
A decent prior seems appropriate (10% maybe.) I'm not convinced that Vitamin D is helpful, but it wouldn't surprise me if it were. (Whereas it WOULD surprise me if Vitamin D helped with COVID or the flu or heart disease.)
Just like you have a plausible story about how Vitamin D might help SAD, I think probably there are stories you could tell about plausible reasons to believe Vitamin D would help many of the other things people thought it would help? I'm not an expert in the history of Vitamin-D-will-help-this-thing theories.
My prior is that there's an infinite number of incorrect but plausible stories you can tell about a phenomenon.
You might put Vitamin D in the class of theories that had plausible stories but didn't work out and not in the class of theories that said it'd help with COVID or whatever. My issue is that I'm not sure has-plausible-story should raise your prior much.
Counterpoint: dawn simulators and lightboxes help with SAD, and you're not getting any vitamin D off them.
If anything, that should make us update away from vitamin D doing anything - the causal agent is something related to sunlight, like... just perceiving light. Or melatonin and fixing the circadian rhythm (my bet). Or something else entirely.
Can you tell that they don't emit "significant amounts" of UVB?
It's a "strong carcinogen" – but not _that_ strong; not literally 'don't ever go outside during the day if you can help it' strong.
I find the standard warnings about 'tanning beds' to be modestly convincing!
But I suspect that there are 'sunlamps' that provide UVB like/similar-to sunlight, and so should mimic whatever effects UVB has on us thru exposure to sunlight.
It wouldn't surprise *me* if vitamin D were linked with reactions to COVID, the flu, or heart disease. But the effect could go either way. (From the above comments I'd expect that high levels of vitamin D might be linked to increased probability of heart problems. Inflexible blood vessels don't sound good.)
"In any case it's cheap and not likely to be toxic. Gonna keep taking my Pascal's wager in the form of my 4000 IU/d."
This is the part I don't really understand. To quote Tim Meadows, "it's the cheapest drug there is." So maybe it's beneficial. It can't hurt you (marijuana can, folks; great joke but not equivalent), it costs nothing, and it takes a half second out of your day. Even if the 50 studies that all generally indicate it's beneficial are all somehow just misinterpreting a boring bone drug, meh.
Go read the comments about tissue calcification. That means calcifying your arteries so you die of an aneurysm or heart attack. Some studies killed rats within a week.
Within a week yes, but at doses thousands of times higher than any discussed here. I would have to somehow consume three containers of 365 tablets containing 4000 IU each every day to get anywhere near the doses in the calcification study. I'm going to ignore that study, the same way I ignore any study where I would have to daily ingest my body weight in iron filings, drink a bathful of water, or munch a cubic meter of shredded wheat to reach the dose studied.
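As a sanity check on that claim, here's the arithmetic, assuming a 70 kg body weight and naively scaling the 100,000-500,000 IU/kg/day mouse doses quoted from the calcification review above (mouse-to-human scaling is obviously dubious, which is rather the point):

```python
# Sanity check of the dose comparison above.
# Assumptions (mine): 70 kg body weight, naive IU-per-kg scaling from the mouse doses.
tablets_per_container = 365
iu_per_tablet = 4_000
containers_per_day = 3
daily_iu_from_tablets = containers_per_day * tablets_per_container * iu_per_tablet  # 4,380,000 IU/day

body_weight_kg = 70
mouse_low_iu  = 100_000 * body_weight_kg   #  7,000,000 IU/day (lower mouse dose, scaled naively)
mouse_high_iu = 500_000 * body_weight_kg   # 35,000,000 IU/day (higher mouse dose, scaled naively)

print(daily_iu_from_tablets, mouse_low_iu, mouse_high_iu)
```

Even three containers a day falls short of the lower mouse dose under this naive scaling, and a single 4,000 IU tablet is more than three orders of magnitude below it.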
Yes, but the leading cause of death is arteriosclerosis, which is calcification of the arterial lining. If Vit-D is a contributing factor, and it's already the leading cause, perhaps Vit-D balance is, or could be, a problem.
Sure, this could well be the case. However, the study we are commenting on doesn't seem to be strong evidence, just like a study measuring the LD50 of water is not that useful when trying to decide how much water to drink daily.
"because they confused correlation and causation (sicker people have less vitamin D)."
I think it's weird how little this kind of context is discussed in regards to Vitamin D. Shouldn't all Vit D studies be controlling for CRP or some other marker for inflammation? Also, what about the risk of arterial calcification if Vit K2 levels are too low? I think we could even give Vit K2 a lot of the credit for increasing bone strength currently given to vit D. As some have remarked, a piece of chalk will appear bright white if you're just looking at bone mineralization. But it's easy to break.
I admit to not having checked in on this topic in about a decade, but the calculations still just seem a little... acontextual.
Also, I've sometimes wondered, given that statin drugs reduce cholesterol and vitamin D is produced from cholesterol, what portion of the benefit of statin drugs is related to their reduction of vitamin D levels in individuals with chronic inflammation. Perhaps that's an errant thought, I admit, but I wonder.
> The nationwide average is about 27 mg/nl, but black people (whose dark skin blocks sunlight) are almost all insufficient and bring down the average; for whites, it’s about 30 ng/ml.
Are American blacks almost all at insufficient levels of vitamin D, or do they just require less of it?
> a typical model of “deficiency” (technically insufficiency, you’re not supposed to use the word “deficient” unless there are observable health consequences)
Wait, so we're *stipulating* that there are no negative effects, but we're calling them "insufficient" anyway, just for fun?
I think the first (by usual standards of insufficiency) but there's not great evidence that they're really suffering because of it. Some people attribute higher levels of schizophrenia in this population to Vitamin D deficiency, but I think it's a stretch. Probably a non-psychiatrist would know of some other problems it's causing them.
> I think the first (by usual standards of insufficiency) but there's not great evidence that they're really suffering because of it.
I don't get it. I ask whether clinical "insufficiency" has anything to do with "insufficiency" in the ordinary English meaning of the word, and you respond by repeating that they really really are clinically "insufficient"?
If the African-American level of vitamin D isn't causing any problems, why are we calling it "insufficient"? What is it insufficient for?
I don't think there's any evidence that American blacks require any less Vitamin D than anyone else, so if the usual standards for Vitamin D are right, they are probably suffering from not having enough of it. I don't know enough about calcium metabolism to know the exact way they are suffering, but I would expect it to involve at least higher chance of bone problems and maybe other things too.
> I don't think there's any evidence that American blacks require any less Vitamin D than anyone else, so if the usual standards for Vitamin D are right
But I think this is a shocking position to default to. The evidence that American blacks require less vitamin D (in terms of bloodstream concentration) than American whites do is precisely the fact that they systematically have less of it. You begin from the assumption that people's bodies are working correctly, not from the assumptions that (1) everyone's bodies are meant to function in exactly the same way; and (2) we have perfect knowledge of what that way is. We know that both of those assumptions are false!
Why aren't we saying that Asian American men have chronically excessive levels of testosterone? Why aren't we saying that *blacks* have chronically excessive levels of testosterone?
Black people are native to tropical climates (in particular, sub-Saharan Africa); their dark skin causing vitamin D deficiency further north is quite likely, given that one of the theories of why light skin evolved in the first place was to deal with vitamin D deficiency.
Not to mention the fact that black people do, in fact, have worse health outcomes overall.
No it's not. We're talking central biochemistry, the kind of stuff that is found essentially unchanged as far back in our evolutionary tree as our rodent ancestors, whereas skin color is a very recent divergence in our family tree. The reasonable default position is to assume that central biochemistry works the same in all races, absent some evidence that for some odd reason a very late change in a small set of genes (coding for skin color) has also changed some core metabolic function.
You'd have a much better argument if the issue was a distinction between sexes, since sexual dimorphism goes back hundreds of millions of years, and it *is* reasonable to assume that even core biochemistry may differ somewhat between the sexes.
>> Why aren't we saying that Asian American men have chronically excessive levels of testosterone? Why aren't we saying that *blacks* have chronically excessive levels of testosterone?
This is not a difference between sexes; it's a difference in core biochemistry between males of different races. And that's the norm; we expect noticeable differences in *everything*.
> The reasonable default position is to assume that central biochemistry works the same in all races
This is absolutely not a reasonable default position. Every time we look for racial differences in any aspect of human biology, we find them. This "default position" has been falsified _every time_ it's been investigated.
I've always had a similar theory regarding African Americans and hypertension. When 80-90% of otherwise perfectly healthy black people all have "high" blood pressure, maybe it's not actually high.
Only 41% of black people in the US have hypertension, and given that a lot of black people are fat and end up dying from cardiac issues, it seems unlikely that this is some "they have a different range" thing and more of a "they're fat and die of heart disease" thing. They do have reduced life expectancy.
Moreover, the rate of hypertension in blacks in the US is higher than it is elsewhere, again suggesting it isn't genetic but related to obesity/lifestyle. Outside of the US, black people don't actually have particularly high blood pressure.
I'd recommend looking into this further. There's quite a lot of recent research on Vitamin D levels in Black Americans, and much of it points to inter-racial differences in what constitutes deficiency. For example, Black people have lower total (many studies only measure serum concentrations which is much easier but also less accurate) 25(OH)D, and much lower levels of Vitamin-D binding protein. 80% (!) of variation in the latter was explained by genetic polymorphisms. Lower VDBP significantly increases the bioavailable fraction of total 25(OH)D.
Now, the research is definitely not unequivocal on this, and there are studies suggesting that Vitamin D deficiency is widespread in Black mothers and this explains some of the disparity in pregnancy outcomes. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3222336/
But what it does show is that we probably can't measure Vitamin D deficiency and insufficiency in the same way in Black populations, which could certainly impact the accuracy and interpretation of results from several of the studies in this post, including the Hadza data.
On an evolutionary level, I would expect that a reduced need for vitamin D would be more likely to evolve in the group that had sustained low vitamin D levels for long enough to change the color of their skin to try to make up for it. People whose skin is still dark are likely to have bodies that are adapted to higher levels of vitamin D than people whose skin has lightened to try to make up for ancestral deficiency.
Which brings up an interesting question for Scott. If Vitamin D is really just a boring bone chemical, was the increased risk of broken bones what drove the evolution of pale skin in northern latitudes? Do we accept that this is a thing which could happen, or are we going to reevaluate our reckoning of where pale skin comes from?
Your wife's a great medical communicator, I think her take on vitamin D is far superior to Scott's here. Please do let her know that her summary of the VITAL trial as finding "No benefit" is now false, as secondary analysis has found a significant 22% reduction in autoimmune disease incidence over 5 years: https://www.bmj.com/content/376/bmj-2021-066452
It's a good take on side effects of extra vitamin D. Biased due to the author's specialty, but still a good reminder that yes, there are side effects.
The big update here is that before recommending larger doses to people, one should ask them if they had their levels checked recently, on the off chance that their levels are already high. But like Scott said, the chances are pretty small.
Thanks for sharing. The biggest news for me there is that she sees that people with vitamin D levels over 70 often do have toxicity. She suggests 30-60 is the real range to aim for.
My memory is that the first time I thought vitamin D might be useful for covid was when I read articles pretty early on in the pandemic that claimed that African-Americans and South-Asian-Americans had the worst results in America, but Africans and South Asians were actually doing pretty well. There are other things that could explain this (mostly that countries near the equator are not logging all the data on deaths), but dark-skinned people in higher latitudes having the worst outcomes seems to fit the vitamin D story pretty well, and makes my prior higher than yours that vitamin D does something for Covid.
I recall hearing such studies about South Asians in the UK. But India seems to have been hit significantly harder by COVID than Africa (perhaps that's a bias in what manages to get reported in the US).
Yeah that’s fair. I’m thinking back to articles that came out before the delta wave that really hit India hard, but the hypothesis is definitely weakened by how India has done since then.
> Some people originally thought Vitamin D did all those things. They mostly thought this on the basis of studies, done at low doses, which found that it did. Those studies mostly found that it did because they confused correlation and causation (sicker people have less vitamin D). Then we did better studies (still at low doses) which found that none of those things were true after all, at least at the low doses which the studies investigated.
> If we then say “Yeah, but it could still be true at higher doses”, we’re missing the point. Now that our original reason for thinking it’s true is no longer valid, we should be back to our prior for any given random chemical, like hydroxymethylbilane.
Among other things, Andrew Gelman occasionally writes about the problem in science publishing of chronological non-independence. He points out that in reality, the truth is not affected by whether you do study A first and follow it up with study B, or whether you begin with study B and follow it up with study A.
The problem is that the norm in publishing is that someone does a low-quality, low-information study that is so bad that it can produce a spurious finding of statistical significance. (This is easier for bad studies to accomplish than it is for good ones. Incentives!) That study has a surprising, interesting, and "statistically significant" result, so it gets published. It is now Official Science, because it's published.
Someone else will then do a better study finding no significant effect. This is also easy to do, because the effect the first study found was an artifact of chance and low quality. And because the first study was published, the second one can be too. But nobody is willing to admit that the second study removes every reason to give any credence at all to the first study. The effect in the first study is real, by definition, because it's published. So the second study is always taken as refining the result, not as rejecting it. You can't reject an effect that is really there. Perhaps the effect is only present when the researcher's last name starts with P.
Gelman writes that this is ridiculous because -- among several other reasons -- if the studies had been done in reverse order, everyone would come to the opposite conclusion, rejecting the idea that the effect was real under any circumstances.
The attitude was that this is an obvious thing to try, that it was easily and safely testable, and that it could help people. It looks like a civilizational failure that no one has tried it yet.
I think that the solution to the riddle is that most of the people trying to treat SAD are doctors and that doctors have justifiably strong priors against dramatically increasing the dosage of anything. "What if we tried more power?" is a refrain for scientists & engineers, not for doctors. Even the safest drug might do nasty things if used in high enough doses. Even water has an LD50.
The particular example of treating SAD with more light is distinct enough from the things that doctors build their intuitions on, that we shouldn't be surprised when they miss something.
> yes, this is the second scientist in this essay studying sun exposure with “Lux” in their name
Unrelated to Vitamin D, but related to light: one ought to do an RCT on nominative determinism in general and light-related names in particular. During my Masters I read a seminal paper on light coming from black holes, by Jean-Pierre Luminet (https://en.wikipedia.org/wiki/Jean-Pierre_Luminet).
So, I might be overly emphasizing individual experiences, but I am trying to figure out how a modern urban/suburban office-working human is getting anywhere near enough sunlight to approximate ancestral sun exposure, even assuming damp little northern island ancestors who wore hats and bonnets a lot.
(I am also remembering time in Korea and other Asian countries, where middle class office working gals walked outside with a purse on their shoulder to keep their faces shaded, at least until they were married, as tans were seen as quite low class and not pretty. So maybe my perception of ancestor exposure is not entirely accurate.)
To me, assuming that diet replacement is sufficient is...assuming a lot? Even reading Scott I am not convinced otherwise.
I think this is just something (like many others) that is tricky to figure out exactly.
Just because our ancestors adapted to, and managed to survive and reproduce living in, some particular conditions doesn't mean that those conditions were somehow _exactly_ perfect for us.
It's _possible_ that we just 'put up with' a lot of 'abuse' from the environment. (And it's also possible that that abuse was something like 'useful stress' too!) And given that everything is entangled-with/correlated-to almost everything else, it's possible that it's just not practical to ever get a clear answer as to what the 'perfect' dose is for anyone, either in groups or in particular.
Sitting in the morning sun on my east-facing balcony is my favorite source of vitamin D. It's best applied with dark coffee and a sativa/indica blend. As I haven't got covid-36 yet, that makes it postmodern-clinical proof that the process cures cancer.
The reason for the persistent association of ill health and low vitamin D is most likely confounding. Deficient animal meat intake and low sunlight exposure are independently associated with unfavourable outcomes, and vitamin D mostly serves as a marker for both.
But since there is an anti-meat campaign (supported by all kinds of special interests) under way, and an anti-sunlight campaign (supported by mostly the same special interests) as well, none of that will get rectified in the near future, so we can enjoy tedious debates such as this one, about whether a boring bone hormone somehow will prevent you from dying.
The vitamin D supplementation bandwagon has been going on long enough, can we please put it to sleep?
I'm giving this comment a minor warning (25% of the way to a ban) - it seems kind of contemptuous, paranoid, and doesn't really explain itself. I feel more comfortable doing this since it's agreeing with me - a lot of people who disagree with me are worse but I have to worry about real or perceived bias.
My bad. Long form (barring the first paragraph, which I think is adequate):
Caution against sunlight exposure has arisen from an epidemiological association of sunlight exposure and sunburn with melanoma. This association was never really strong, though (RR 2.0), and did not appear in all studies. It was largely driven by reactive physician anxiety resulting from an epidemic of skin cancer overdiagnosis (a summary: https://www.nejm.org/doi/full/10.1056/NEJMsb2019760). It also has serious holes: logic would dictate that most melanoma should appear on the face and hands, since exposure is high there due to a lack of clothing, yet this is not the case (more summary of contradictions in this body of work: https://www.bmj.com/content/343/bmj.d5477).

Why did we go this direction? Nobody really knows; it appears to be the usual confluence of faddism and groupthink in academic medicine. There is an elephant in the room, though: the sunscreen industry. Increasingly aggressive high-SPF lotions (SPF 50 to 100 being unheard of even fifteen years ago) are being marketed to everyone, virtually guaranteeing that nobody develops any tolerance to UV light the natural way anymore and keeps buying more sunscreen. Sunscreen is now sold to black people, who don't need it, but marketing does the trick I guess (https://www.outsideonline.com/health/wellness/sunscreen-sun-exposure-skin-cancer-science/ this is a shoddy article and probably conflicted in interest, but the interviews are gold).

All the while another association has popped up: sunlight exposure and longevity/some markers of general health (https://pubmed.ncbi.nlm.nih.gov/24697969/). This keeps being ignored, mostly, and it will likely take years before anyone does conclusive interventional studies on this, since all IRBs will run amok due to being biased (UV is far too dangerous to do an interventional trial, yada yada). This is exemplary of how a confluence of a mediocre academic belief with industry interest results in a deadlock which potentially jeopardises people's health, yet is guaranteed to persist without any challenge being possible at all.
The same thing happened with meat consumption: academics championed the idea that meat consumption is responsible for coronary artery disease based on legendarily bad evidence in the 1960s. This was immediately supported by the food industry, which was busy hiding the fact that they were killing people with sugar (check Cristin E. Kearns' work for the full monty https://sugarscience.ucsf.edu/cristin-kearns.html) and eagerly jumped on the fat-heart-disease bandwagon. The evidence is decidedly mediocre (https://pubmed.ncbi.nlm.nih.gov/30958719/ and https://bmcmedicine.biomedcentral.com/articles/10.1186/1741-7015-11-63 and a few hundred studies just like this). Sometimes the association appears to rely entirely on processed meat, which like all processed food is associated with poverty and is thus likely just another measurement of low socioeconomic status (I don't have the space to delineate the entire controversy here).
There is also an elephant in the room, though: the food industry is not particularly fond of meat. It is a low-tech product (a gun and some knives are all that is needed for production), the raw materials (raised and finished animals) are expensive as hell, it needs manual labor to boot, and production of high-value meat is decentralised in mom-and-pop specialty butcheries. The only money to be made in meat is by exploiting immigrant laborers in giant slaughtering and packaging plants with extremely poor working conditions, frequently aided by a currency exchange gradient (remember the COVID disaster in the meat plants? This is how we got there).
The food industry would more happily sell vegan meat substitute, which is cheap to make from cheaply farmed cash crops, even more cheaply produced in a highly industrialised factory and sold with a gargantuan markup.
Again, a confluence of rather poor academic beliefs with political and industrial interests created a deadlock that potentially threatens population health, but will hold for the foreseeable future.
> Evidence is now trickling in that veganism is associated with all kinds of disease
The two papers you linked seem to me extremely weak evidence for this. One was about fractures, and I would guess is explained by vitamin d (maybe take some supplement…). The other is about mental health, showing raised depression scores and lowered anxiety scores amongst vegans. I’m sceptical; why lower anxiety? Also, no apparent attempt to separate correlation from causation (though I’ve only read the first page, but this seems like a crucial point).
I am mystified as to the anxiety scores, but this is likely just decreased agitation from the lethargy usually experienced on vegan diets.
Correlation vs causation - this is correct, but unfortunately all we have in nutrition is low-quality non-evidence from correlative studies.
I am more than willing, however, to assume that there is a real detriment to vegan diets, since this has been observed in clinical practice for at least a decade: predominantly young women with all sorts of psychiatric and functional disease and nothing wrong except a vegan diet - usually this quickly reverses upon a few weeks of normal meals. The few people I personally have seen trying vegan diets found them intolerable (rather quickly, the only source of nutrition becomes starch and sugar and processed vegetable oils). I realise this is inadequate for a blog focusing on rationalist ideas, but when there is no real evidence to go around, this is what I am left with.
It would be interesting to see an interventional trial, but as I mentioned, nutrition as a research field is not bothering with those anymore (probably because of a lack of success in proving the fat-heart-disease connection - see the Minnesota coronary experiment and the Sydney diet heart study).
The bone thing is complicated: most strength in our bones comes from collagen, which deteriorates when inadequate amounts of animal protein are ingested. The calcium appears to only contribute a minor amount (hence the wide range of bone mineralisation compatible with life). Our body is rather wasteful with bone calcium, too: in a bind, bones are readily demineralised to raise the blood calcium level, and this takes weeks to reverse. I largely think the expedience of measuring bone calcium by X-ray transmittance (DEXA) has unduly focused research on bone mineralisation instead of bone collagen structure, which appears to be more important in preventing fractures anyway, since it is the only thing in bones resistant to shear stress (apatite mostly just resists compression loading). There is no real way of imaging or otherwise analysing bone collagen short of biopsy and staining with tedious image quantification, so it never became of clinical interest.
Edit/addition: in the elderly, who already have demineralised bones, D3 is widely prescribed along with calcium gluconate or calcium carbonate to strengthen bones. As far as I can tell, it is rather useless at that (no one really recovers from osteoporosis; maybe the decline in bone density is slowed a little bit, but the downhill momentum isn't appreciably changed). To me this suggests that (similarly to sarcopenia, which is not caused by a lack of exercise but by a lack of dietary protein) osteoporosis is largely the consequence of protein malnutrition.
> Correlation vs causation - this is correct, but unfortunately all we have in nutrition is low-quality non-evidence from correlative studies.
This is not true? There are many RCTs on veganism and vegetarianism and cardiovascular disease (veganism seems better than the typical western diet, maybe on par with most other diets), weight reduction (vegan diets seem better than typical western diets) and bone health (vegan diets carry some bone health risks, but this is due to vegans more often lacking calcium or vitamin D, and these can be supplemented), among other things.
- Yokoyama et al., "Vegetarian Diets and Blood Pressure"
- Lopez et al., "The Effect of Vegan Diets on Blood Pressure in Adults: A Meta-Analysis of Randomized Controlled Trials"
- Lee et al., "Effects of Vegetarian Diets on Blood Pressure Lowering: A Systematic Review with Meta-Analysis and Trial Sequential Analysis"
- Viguiliouk et al., "Effect of vegetarian dietary patterns on cardiometabolic risk factors in diabetes: A systematic review and meta-analysis of randomized controlled trials"
- Huang et al., "Vegetarian Diets and Weight Reduction: a Meta-Analysis of Randomized Controlled Trials"
> here is an anti-meat campaign (supported by all kinds of special interests) on the way
well, sure, though it's certainly less of a stretch to just chalk it up to legitimately held morals rather than special interests... I would expect Big Green Bean to lose to Big Animal Husbandry in a corporate proxy war
> and an anti-sunlight campaign (supported by mostly the same special interests) as well
huh? an anti-sunlight campaign? As Scott said, you can't just throw something like that out without explaining yourself
I found it pretty easy to grasp. If you have kids your pediatrician will warn you against sunlight as if they were little vampires, telling you terrible things will happen if you don't slather their delicate little skins with SPF 50 and put on a hat and sunglasses every time you go outside. It can be pretty strong stuff. Maybe they've backed off on this a little since my kids were young, though, which was admittedly 25 years ago or so.
Hmm, thanks for the explanation. Though as a 24 year-old I can't say I ever heard my pediatrician dwell on that. Obviously if you're a pale Irish heritage laddie like me and go to the beach and don't put on sunscreen, you're in for some brutal sunburn, higher risk of skin cancer, and leathery skin once you're 60.
I am always skeptical of phrases like "higher risk", but given that 4 of my parents and grandparents have had to get Mohs procedures done, and my grandparents more invasive surgery, we don't seem to be talking about "doubling your chance of getting hit by lightning" here.
Seems to me like the society I know at least is pretty well calibrated against sunlight exposure risk, but it's good to hear one testimonial to the contrary.
It may be the pendulum has swung back. When I was a kid, in the late Pleistocene, it was considered quite healthy for kids to get plenty of sunshine. A deep suntan was kind of expected in the summer months, and all of us got a sunburn or two in the late spring/early summer, each year, if we weren't sufficiently careful, which we often weren't. Our mothers shrugged with general indifference, figuring the discomfort of itchy peeling skin was enough to teach us to be moderate. But nobody really thought of sunscreen unless you were going to the beach or Florida and expected to be out *all day*. For just mowing the lawn or playing a game of ball? Nah.
When my kids were born, it had swung hard the other direction, and we were advised to avoid suntans, avoid the sun in general, until they were at least in double digit years, and allowing a sunburn would be like mild child abuse. It was a rather surprising shift from what I'd experienced as a kid -- but then, you always figure as a new parent that *your* generation is going to be The Best Parents Ever so I didn't really question the received wisdom.
My impression is that people are more moderate now. People who are unusually sensitive (such as yourself) are advised to take due care, and I don't think anyone advises a sunburn, but even my dermatologist thinks it's a good goal for kids to get plenty of sunshine in the summer, enough to form a moderate tan.
Legitimately held morals seem not to apply when judging agriculture, which relies on the large-scale destruction of habitat and the extermination (by means of ploughing, pesticides and harvesting) of thousands of deer, rabbits, mice, rats and all sorts of birds and millions, if not billions of insects per square mile of farmed land.
I understand why people care less about insects than other animals, but hopefully no one tries to argue that the life of a cow shot dead for the steak is worth less than the life of a stag shredded in a combine.
Likely, the most ethical way of obtaining nutritionally adequate food (in the sense of killing the fewest animals per calorie) is hunting game and herding cattle, sheep and goats, provided their fat is used rather than burned, as it currently is. However, since this intersects with the anti-fat bandwagon of 1960s academic medicine, there is a nice impasse reached again...
You seem to be implicitly making the argument that plant agriculture is as deadly/harmful to animals as animal agriculture. This is something that people really really want to be true, but definitely isn't - if nothing else, because more crop production goes to feeding livestock than to crops eaten directly by humans.
I'd be interested in your views on this Medcram video. The argument it puts forward is that people with high vitamin D do better against Covid but that vitamin D doesn't protect against Covid. That's because the thing that actually does protect against Covid is produced by the thing that produces vitamin D. That thing being sunlight. I think his hypothesis is melatonin, but the takeaway for me was to make sure to get outside as often as you can, which I already do! https://youtu.be/9eEyWlbToI4
Going a bit further, the potential chain of confounders is almost endless. People who get outside more tend to get more exercise, be less overweight, eat healthier diets, have better access to medical care, a more supportive group of friends and family, and anything else you might come up with.
True. But he does have a hypothesis for why melatonin might be the answer. (At least I think it was melatonin - it’s a while since I watched) Something to do with Ace2 receptors or something equally scientific!
I think a better way of expressing this idea is "circular locating the hypotheses".
There are countless chemicals and microelements that could, in theory, be supplemented with good results. The difficulty is primarily in focusing on the right ones - the actual checking provides fewer bits of information than the focusing does.
For various reasons we focused on Vitamin D. Probably because it really helps in a minority of cases, and also because we had a plausible story - less light exposure due to staying indoors and dressed. But once we focused on it, the checking is done and we didn't find anything.
What we're doing now is looking at the vast space of possible chemicals and saying "hmm, that Vitamin D looks interesting, look at all that fuss around it. I wonder if it's good? Let's check". That's a feedback loop - the more you check, the more "interesting" it looks, regardless of actual results. The only way to break the loop is to look at the evidence, accept (or not, as the case may be) that it's been studied enough, and move on.
Having lived in the US and Europe, one thing that I noticed was how much less sunlight you get in Europe, at least seemingly. A huge fraction of the US population have ancestors who lived much further north. My ancestors came from the British Isles, Germany, and Scandinavia (AFAIK). Scotland is much further north than Indianapolis, where I grew up. Indianapolis is level with Rome. It's quite plausible that, even while being inside for more hours, a person whose ancestors lived in northern Europe would be exposed to roughly as much sunlight in a place like Indianapolis.

Then I moved from California to Moscow, Russia, and while there I supplemented with vitamin D for six months in winter. As an academic, at some point I went searching for high-quality RCTs which showed vitamin D supplements boosted immunity. I probably overlooked a bunch of studies, but at the time I couldn't really find anything too impressive. Nevertheless, b/c of the online hype, and the total lack of sunlight in the northern European winter, I'd supplement with vitamin D daily. In summer I try to get outside for at least 30 minutes a day and be active.
I can see the attraction of vitamin D pills. Who doesn't want to believe in a magic pill that makes you healthier at almost no cost? To my reading, that "pill" is to get exercise, eat a balanced diet with vegetables, fruits, seafood and meat, and don't do things that interfere with your sleep schedule. Go light on alcohol and don't smoke. Do some kind of resistance training.
Maybe I'm wrong here though, and I should be supplementing with something.
Another question I had: is it obvious that taking a supplement pill would be the same as getting more sunlight? I don't understand the physiology at all. But it seems plausible to me that getting vitamin d from natural sunlight might do more than taking a vitamin d pill.
A related, but slightly different point is that it's healthier to be outside than in a tight, confined space, potentially inhaling viruses from other people. And it's also healthier to be active. Vitamin D can be a proxy for both of these things.
The other thing to talk about here is skin cancer. In Australia, which has a huge population of people whose ancestors lived in Britain, skin cancer is a huge problem.
One worry I have with vitamin D supplementation is... If your body gets used to you taking a supplement, maybe your body will react by synthesizing less of it? so, if you start the supplements, you'll have to take them forever, or have some negative effect when you stop?
The active hormone form of Vitamin D is two reactions down from D3. Your body cares about the concentration of C, it produces C by A --> B and B -->C, with any excess of A dealt with by the kidneys like anything else. The diet/exposure/supplement goal is making sure you have sufficient A to feed the two reactions. You probably aren't affecting the regulation of the metabolite reactions by getting more A from a supplement.
I think a useful comparison here is the Inuit. They live further north than almost anyone in Europe, and have noticeably darker skin.
The story I heard, somewhere on the Internet, was that the main sources of vitamin D for humans are sunlight and meat, and lowered melanin (lighter skin, and maybe some hair effects) evolves when there's a population in higher latitudes that almost entirely eats plants. Historically, this pretty much meant settled peasant farmers, the main examples being Europe (especially around the Baltic Sea) and parts of China. Populations in higher latitudes that eat a lot of meat, don't *need* to evolve lower melanin levels, but there's no evolutionary pressure to have high levels either, so it's subject to drift.
Not just the Inuit. The Mongols. They lived at similar latitudes to Russians or Germans. Only they considered semi-starved agricultural laborers in China a crime against humanity and put their fields to pasture.
In 1972 my physical anthropology prof flatly stated that darker skin selected against skin cancer and lighter against rickets—and that explains the relationship between skin color and distance from the equator—particularly in cloudy Europe. I don’t see rickets mentioned.
Maybe I'm a dopey layman, but didn't hunter-gatherers in temperate climates wear shirts? I'll go so far as to suggest that they didn't wear Abercrombie & Fitch [citation needed], but the things that live out in the sun, like flies, brambles, and pole cats, all can mess up your skin and introduce disease. That's not a modern realization.
"Surprisingly, average American levels seem about as high. The nationwide average is about 27 mg/nl, but black people (whose dark skin blocks sunlight) are almost all insufficient and bring down the average; for whites, it’s about 30 ng/ml. Why are these levels as high as some of the farmer-specific studies elsewhere? Maybe it’s Americans’ better nutrition - or maybe it’s that lots of Americans already take Vitamin D supplements. Canadians are close behind at 26 ng/ml; they fail to break this down by season but I’m guessing it was in the summer."
Because all of our milk is Vitamin D fortified, I strongly suspect it's the milk keeping most americans relatively high. On top of the reduced effect of sun, blacks are much more likely to be lactose intolerant. It would be interesting to see numbers separated by latitude and milk consumption.
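For what it's worth, a rough back-of-envelope on how much fortified milk it takes to hit various intake targets (the ~100 IU per cup figure is a commonly quoted approximation for US fortified milk, not something from this thread):

```python
# Rough estimate: cups of fortified milk needed to reach common vitamin D intakes.
# Assumption: US fortified milk provides roughly 100 IU per 8 oz cup (approximate).
iu_per_cup = 100
targets = {
    "adult RDA (600 IU)": 600,
    "RDA for age 70+ (800 IU)": 800,
    "4,000 IU dose discussed upthread": 4000,
}
for label, target_iu in targets.items():
    print(f"{label}: ~{target_iu / iu_per_cup:.0f} cups/day")
```

So a glass or two a day contributes a few hundred IU at most.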
I do wonder if the effects of vitamin D are secondary effects. If you have issues caused by D deficiency, fixing that allows your body to repurpose other resources towards other issues. Kind of like the anti-parasite drug that seemed to help against covid... in areas which had a higher prevalence of parasites.
Personally I take 10,000 IUs, and it seems to help both me, and everybody I have recommended it to.
However - I, and everybody I have recommended it to, also happen to be lactose intolerant office workers who get very little sunlight. Social bubbles, I guess? Dunno why lactose intolerance is so insanely common in my social bubbles, though.
Additionally, the observation pattern is not "This helps me", it's "It's not the vitamin D, I'm just having a better week than normal. Oh. It happened again. Oh god. Oh god I'm a bag of chemicals."
My guess would be that 10,000 IUs is too low to be directly harmful, while also being sufficiently high to quickly resolve long-term severe deficiencies in a timeframe short enough to be personally observable.
Given social bubbles, I don't think "General population who is at normal or slightly insufficient levels" is the correct reference class. Nor should vitamin D be considered a "normal" vitamin when considering deficiencies; the cluster of people who benefit from it are going to benefit from a significantly higher dosage than you'd expect looking at the average population.
I think this is misleading. Most children aren't lactose intolerant, by and large; people become slightly lactose intolerant as adults because they stop ingesting dairy regularly. Adults that continued regular ingestion of dairy mostly don't have issues.
Similar patterns happen with other adaptations too. If you go on a keto diet for a long time and then start eating carbs again, you'll have serious bloating and gastrointestinal issues for awhile too.
We've identified the location of the gene that is responsible, and found mutations associated with lactase persistence. It is genetic, and primarily so; some people do maintain the ability to digest lactose without the mutation, but most do not.
Lactose intolerance is uncommon amongst white people, but in every other racial group it is the norm. If you work with a lot of Asians, black people, or Native Americans, you're likely to know a lot of lactose intolerant people.
Glad Scott mentioned obesity, because a lot of the discussion of Vitamin D I've seen omits what seem to be important differences in vitamin absorption in obese people. It's a harder topic to talk about due to PC, but IMO should be noted whenever statistics about what percentage of Americans are Vitamin D deficient are thrown around. People who are at a healthy weight are at a lower risk of deficiency than the general population and should supplement less. (Although de facto, just as vaccinated people are more likely to wear masks, I expect that non-obese people are more likely to take Vitamin D.)
(Also—if you're supplementing D because you get outside very rarely, consider changing that?)
Having seen crippling adult asthma mostly cured through high dose Vitamin D, I will persist in believing that there is something besides calcium regulation going on.
As for what constitutes Paleo Vitamin D levels, look at some National Geographic magazines from a century ago. Many warm weather people wear considerably more clothes these days. Conversely, Americans wear considerably less than they did a century ago.
wow: as a biologist (me), instead of focusing on evolutionary "theory" and Vitamin D dosing, it would be more appropriate to look up the modern data available on our (humans') ability to metabolize and use Vitamin D on a daily basis. Vitamin K is in its own category because of its relationship to blood coagulation, so that should be medically evaluated. Vitamin C has many studies, along with zinc, which show that it does boost immune responses. People think that vitamin dosing is benign, but I have personally known people with some bipolar issues and other symptoms who overdosed on vitamins, giving them diarrhea, cramps, etc. A lot of clinical data on modern daily doses of all vitamins is available. Still a good article. :) lots of work too. more than what I would do :) have a great day!
Chemicals on the giant chart of metabolic pathways are kind of like religions. For any chemical with a large enough biological role, there is going to be some group of people exhibiting cult-like behavior, who are absolutely convinced that it is the One True Chemical that is the ultimate solution to all disease.
Frustratingly, some of these groups are probably correct, in the same way that on the day the world goes up in flames in a nuclear apocalypse, there is bound to be some conspiracy theorist somewhere in the world who predicted it correctly.
Hehe, I feel called out, given I say I belong to the Church of Cobalamin. :) (That's vitamin B12.)
I do at least try to curb my enthusiasm, and/or plaster it with disclaimers that This One Weird Trick may not work for you at all. But there's something nice about being able to recommend people just give it a shot because it's over-the-counter and it's one of the few vitamins where overdosing isn't really an issue. "Are you having trouble sleeping? Inexplicably sensitive to sound and light? Depressed? Troubles thinking straight if someone talks nearby? Try vitamin B12 t o d a y!" It's had a ~20% success rate so far, which is good enough that I intend to keep recommending it (if the symptoms I expect to see match enough). But it is *definitely* my hammer where lots of things suddenly look like nails! ;) Guilty as charged.
(I don't think it's gonna cure COVID-19, though, just to be clear, nor any other pathogen-caused disease.)
I'm sure I spend less time outside than my father (who grew up on a farm), and he was outside less than his father (who lived on a farm his entire life), and he less than every other ancestor before him (who worked farms using only horse power), in times before sunblock or sunglasses were invented.
Does this mean my vitamin D levels are lower than any ancestor? Is this why I'm so near-sighted?
"Hunter-gatherers in the environment where most of our evolution happened might have been outside all day shirtless. On average the sun's halfway from peak, so that might be equivalent to 8 hours of peak sunlight at the equator."
Okay, from the start I am going to dispute this. We've been told hunter-gatherers had this idyllic ancestral lifestyle where they got all their needs met in a few hours and had the rest of the day for leisure time. They weren't forced to be out toiling in the fields under the blazing sun for hours every day.
If they're walking around shirtless under the equatorial sun for eight hours a day, then they are stupider than lions:
"Lions are most active during dawn, dusk and periodically throughout the night. During daylight hours they can be found lounging or sleeping in the shade."
If our shirtless hunter-gatherer can't figure out "get under a bush, stay in the shade, don't move around too much, and drink water", then he has more problems going on than "boy, I must be generating *so* much Vitamin D right now!" can solve.
After all it's just mad dogs and Englishmen who go out in the mid-day sun:
Me and everyone in my family are vitamin D deficient in the winter (dark skin + live in the UK), and the main complaint we had that got fixed by taking vitamin D was extreme fatigue. We went to the doctor about tiredness, got blood tests, and then got given the 20,000 IU or 50,000 IU pills. At no point did anything about bones come up when discussing it with the doctor, for me (can't speak for everyone else).
So many paragraphs on D without hitting the most important points. You hold up the IOM (now renamed the NAM) as more prestigious than the Endocrine Society despite their statistical mistake and the fact that the Endocrine Society has 18,000 members (in "medicine, molecular and cellular biology, biochemistry, physiology, genetics, immunology, education, industry, and allied health" according to Wikipedia). Okay.

The IOM's own minimum serum level recommendation for 25(OH)D is 20ng/ml. This is also the target minimum set by the European Food Safety Authority, Germany, Austria, Switzerland, the Nordic countries, Australia, & New Zealand. It was also the consensus rec of 11 international orgs (see https://academic.oup.com/jcem/article/101/2/394/2810292?login=false). Despite this consensus, roughly 50% of humans globally do not achieve this level. Calculations & citations for the 3-4 papers involved can be found at: https://twitter.com/KarlPfleger/status/1390775110257102848 (I welcome corrections). Why don't you see this as a big problem? Even if only for global bone health. For the Endocrine Society's recommended 30ng/ml minimum, roughly 3/4 of people globally are too low. Note also that this higher 30ng/ml is the level at which typical blood tests from Quest or LabCorp in the US are flagged as low.

What governments are making concerted efforts to drive down these deficiency rates? 25(OH)D tests are inexpensive & widely available. The normal standard is that RDAs are set so that 2.5% or fewer people are deficient. The US estimates typically come in at 25-35% or sometimes 40% at the 20ng/ml level. That's 10x the maximum target deficiency prevalence. What US government agency is responsible for reducing these deficiency numbers?
And while those deficiency %s were for populations as a whole, looking at racial breakdowns also compels action. As noted by Ames, Grant, & Willett in https://www.mdpi.com/2072-6643/13/2/499 dark skinned minorities have it much worse. In the US 75% of blacks are clinically deficient & 96% are below the 30ng/ml level recommended by the Endocrine Soc. It's not absolutely proven that this explains part of their worse health outcomes but it is clearly the conclusion of these authors (who are respectively pillars of nutrition research, vitamin D research, and clinical medical research) that it is an important factor, and certainly not proven that it isn't. This is clearly a racial issue. These are absurdly unacceptable deficiency prevalences. Why aren't public health groups & government agencies trying harder to help these groups?
You both downplay the importance of the observational studies too much. Much of science and many important clinical medical decisions are & should be based on observed correlations together with understanding of the basic underlying science (physics/chemistry/biology). Astronomy and other scientific fields make progress without the use of any RCTs. Seatbelts, parachutes, and smoking warnings on cigarettes are all justified without recourse to RCTs.
But now let's talk about Covid risk factors. Age, obesity/overweight, and comorbidities like diabetes are established Covid risk factors that no one questions. All of these are based exclusively on observational data. No RCTs are part of establishing these as legitimate risk factors. Vitamin D status, i.e. 25(OH)D, is also unquestionably at this point a statistically significant risk factor for Covid: 75+ studies with ~2M aggregate subjects, multiple meta-analyses, narrow confidence intervals, and an effect size of 1.5-2x difference in risk, mostly based on the 30ng/ml or 20ng/ml thresholds. (See https://twitter.com/KarlPfleger/status/1486565564671692804 for citations.) This is not just important for hypothesis generation about therapeutic potential; it's also important for stratifying the absolute risk level for groups or individuals, which is how one clinically computes the Number Needed to Treat (NNT) to help decide on the relative benefit vs risk of potential interventions, such as vaccination or use of anti-viral drugs.
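To make the NNT point concrete, here is a minimal sketch with entirely hypothetical numbers (a 5% baseline risk of a severe outcome in the sufficient group, and the 1.5-2x relative risk range above); it only illustrates how stratified absolute risk feeds into an NNT, not actual Covid figures.

```python
# Minimal sketch: translating a relative risk into a Number Needed to Treat (NNT).
# All numbers are hypothetical; nothing here is taken from the studies cited above.

def nnt(risk_if_sufficient: float, relative_risk: float) -> float:
    """NNT = 1 / absolute risk reduction, assuming correcting deficiency
    moves a person from the higher-risk to the lower-risk group."""
    risk_if_deficient = risk_if_sufficient * relative_risk
    absolute_risk_reduction = risk_if_deficient - risk_if_sufficient
    return 1.0 / absolute_risk_reduction

# Hypothetical 5% baseline risk of a severe outcome, with the 1.5-2x RR range above.
for rr in (1.5, 2.0):
    print(f"RR {rr}: NNT = {nnt(0.05, rr):.0f}")
# RR 1.5: NNT = 40
# RR 2.0: NNT = 20
```

The same relative risk gives a very different NNT at a different baseline risk, which is why stratifying groups by 25(OH)D status matters for the benefit-vs-risk arithmetic.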
Meantime, no evidence suggests that being vitamin D deficient is protective for Covid. And known immune biology suggests multiple clear mechanisms of action by which D should be protective. (See eg https://asbmr.onlinelibrary.wiley.com/doi/full/10.1002/jbm4.10405 but more mechanism papers in the previously linked Twitter thread). So, that leaves a pretty clear-cut benefit vs risk analysis:
If governments / public health officials emphasize reducing deficiency for pandemic control & end up being wrong about D helping Covid, the biggest side-effect would be reduction in the huge prevalence of deficiency (see other comment I just made here), w/ consequent improvements in population wide bone health, and probably autoimmune health, & resistance to other ARIs.
Or conversely, if officials recommend a concerted effort to reduce deficiency for the non-Covid benefits, reduced Covid transmission, hospital burden, & deaths are all plausible side effects even if not guaranteed. But a worse pandemic is not. I find your overall stance puzzling given the apparent imbalance when viewing things this way.
Lastly, (this is the 3rd of 3 top level comments I'm making here) let's talk about scientific hypotheses and the proper ways to test them. The massive observational data showing correlation between 25(OH)D and increased Covid risk suggests more than anything else a specific form of clinical intervention: give deficient people vitamin D until they have enough (e.g. those starting at <20ng/ml until they reach >=30ng/ml). This is in fact exactly the form of intervention long advocated by vitamin D researchers such as Heaney: https://academic.oup.com/nutritionreviews/article/72/1/48/1933554 and Grant et al: https://www.sciencedirect.com/science/article/abs/pii/S0960076017302236?via%3Dihub
The intervention of giving a fixed amount of D to people whose baseline D levels haven't been tested, or of giving a fixed amount after testing baseline levels but without confirming that sufficient levels were achieved before starting to record adverse event differences vs control, is not the correct study design to properly test the main hypothesis. And in fact these problems are the main reasons why many vitamin D RCTs fail. See for example https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7487184/
Given the massive observational data, the many clear mechanisms of action, and the fact that no pharma company will fund a study of an inexpensive supplement, it is puzzling that governments have not adequately funded a properly designed study to test the hypothesis properly. To this day there is no RCT testing the intervention of raising 25(OH)D from 20 to 30ng/ml that is powered to adequately rule out the hypothesis that it would reduce severe Covid outcomes like death & ICU, yet people (including this piece) keep (inappropriately in my opinion) casting doubt on that hypothesis despite compelling biology.
The observational data is solid. It doesn't prove causality, but it's suggestive of such a strong potential benefit that it should be someone's responsibility to disprove the causality by finding the confounding shared causal variable that fully explains the correlation. Until then it's reasonable to apply the precautionary principle and try to ensure people are not deficient.
I remember people making similar arguments -- can't hurt! might help! precautionary principle -- about Vitamin E about 25 years ago.
Turns out, when the careful work was done[1], Vitamin E supplementation not only doesn't help in one of its target disorders (heart disease), it actually boosts the risk of heart failure. So significant supplementation does active and measurable harm. Oopsie!
This is a false analogy because the hypothesis about vitamin E back then was primarily that it just helped regardless of status, not that a huge proportion of society was deficient. The precautionary principle is much more appropriate to use in today's vitamin D case precisely because there is ~50% global deficiency (at the government set recommended level) and ~75% insufficiency at the Endocrine Society set min level, and the relevant hypothesis suggested by the massive observational data is that fixing the too-low levels is what will provide benefit. We know that giving more and more of most things doesn't continue to provide increasing benefit forever so it was not entirely reasonable to say that more E regardless of baseline was ideal use of precautionary principle. It's much harder to imagine that fixing the too-low D levels will cause harm than blindly giving more E. There's no known benefit to being vitamin D deficient, especially for Covid. So precautionary principle is much more justified in this case.
I remember reading an article back in the 1990s or early 2000s in Natural History (AMNH's member magazine) about vitamin D and folic acid. It pointed out that increased sunlight raises vitamin D levels but lowers folic acid levels, so there is a tradeoff between getting enough of each. This suggests that there is probably a limit on how much D a person can acquire from sun exposure without running into a folic acid shortage.
(The authors also argued that this accounts for women having slightly lighter skin than men as a result of their having a different balance point for the two chemicals. Is this even true? I have no idea. I can think of dozens of confounding factors, but that's a discussion for another time and place.)
Is it that hard to believe that Vitamin D is a boring bone and immune system chemical, as opposed to just a boring bone chemical?
The idea that it can "treat COVID, prevent cancer, prevent cardiovascular disease, and lower all-cause mortality" sounds ridiculous when described like that, largely because it implies that those are all separate things that Vitamin D just happens to fix. But really, it's all one simple (and thus not especially unlikely) effect: It improves the immune system's functioning, and anything that improves the immune system's functioning is going to make the body more capable of dealing with illnesses in general, including COVID, cancer, and probably some forms of heart disease (specifically, whichever types are caused by pathogens). Obviously it's not a miracle cure for any of those things - it's not guaranteed to prevent you from getting them, nor to make them go away once you have them. But it can make your body a little better at dealing with them.
"Improves the immune system" is just hiding the magic underneath another anodyne phrase. What do you mean, specifically, by "improve the immune system?" There are plenty of drugs one can imagine that interfere with this signalling pathway, or cofactors required for this particular cascade, et cetera -- but these are all highly specific causes and results. It would be unusual to find one compound that has such broad activity, across the hundreds to thousands of pathways involved in immune response, such that one can only really describe it by saying it improves the immune system.
That, I think, is the point. If one small molecule can have such very broad positive results, it follows, looking at it from the other direction, that this one molecule is a single point of failure for an entire critical physiological system: not enough Vitamin D and *everything* goes to hell. That is not in general how physiology works (nor would it be a good engineering design for a robust system). Usually there are backups and secondary pathways and other ways of stuff getting done for almost any major physiological system. We can even do without O2 for very short periods.
That's not to say Vitamin D necessarily *doesn't* play as central and critical a role as would be required for it to have these very broad effects, but it's a proposition that one would quite reasonably view with skepticism, on the grounds that it seems dubious Mother Nature (or God the Creator if one prefers) would build a system with such inherent design fragility.
Granted, I'm not an immunologist, and I don't claim to understand the mechanisms through which a chemical might conceivably "improve" the immune system, or even the mechanisms through which the immune system works in the first place. For that matter, I don't really understand what mechanisms would cause a "boring bone chemical" to improve bone quality, or how a substance might conceivably improve circulatory health or digestive functioning or neurochemical balance or any number of other things that various substances are capable of doing. Is there any reason to think that it's *less* likely for a substance to be able to affect immune response than any other particular "system" in the body?
Not at all. I'm just saying (1) almost no normal nutrient has any significant effect unless you're suffering from serious deficiency, on account of the body normally has about eight different pathways for doing the same thing as backups to mild deficiency -- and, I mean, thank God, that's why we can take statins without killing ourselves, because although we interfere with one major pathway for building steroids (the one that leads to cholesterol), there are backups and workarounds to build those we can't possibly do without; and (2) if you are looking for *positive* effects (improvements above normal-baseline-healthy), you would expect those to be rather specific, what we expect from most therapeutic drugs, e.g. you wouldn't expect Celexa to improve your kidney and immune function as well as your mood, and you won't expect your BP meds to make you better able to focus and more resistant to winter flu.
But statins do end up having a major effect, don't they? So these natural redundancies don't end up replicating statins' effects. Which kind of refutes the objection, doesn't it?
> Our priors on a random chemical doing that have to be less than 1%, or we get caught in weird logical traps.
Maybe not all of those at once, but what are the chances that a chemical does any one of those? If it's less than 5%, then that means that, given some random chemical passes a well-done controlled randomized trial, it probably doesn't work.
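To put numbers on that reasoning, here is a minimal sketch of the base-rate calculation, assuming a conventional 5% false-positive rate and a hypothetical 80% power for the "well-done" trial; the priors are illustrative, not measured.

```python
# Posterior probability that a chemical really works, given one positive RCT.
# Assumes alpha = 0.05 (false-positive rate) and a hypothetical power of 0.80.

def posterior_given_positive(prior: float, power: float = 0.80, alpha: float = 0.05) -> float:
    true_positives = prior * power
    false_positives = (1 - prior) * alpha
    return true_positives / (true_positives + false_positives)

for prior in (0.01, 0.05, 0.20):
    print(f"prior {prior:.0%} -> posterior {posterior_given_positive(prior):.0%}")
# prior 1%  -> posterior 14%
# prior 5%  -> posterior 46%
# prior 20% -> posterior 80%
```

So with a prior below 5%, a single significant trial still leaves the posterior under 50%, which is the sense in which the chemical "probably doesn't work" even after passing the trial.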
On the importance of vitamin D: we dropped our skin pigmentation on our way out of Africa in evolutionary record time; whatever the reason was, it apparently changed either mortality or reproductive success dramatically. And it's not unreasonable to assume that this is vitamin D, or that vitamin D plays an important role in it. Not sure how this can be proven though; it's just a hypothesis.
Regarding the minimal/optimal levels: I know the Hadza studies, but there are many more. I remember a few from India and Italy that looked at people who work outside with less or little clothing, like farmers and construction workers, and they all came out at levels between 40 and 60 ng/ml as the "normal/optimal" range. But of course there are plenty of studies pointing in the other direction, considering 40 ng/ml as way too high; the official recommendation here in Germany was for ages "nobody needs any supplements, including vitamin D, you get all you need from a balanced diet!" Exception to the rule: babies in their first year get 400 IU here because this dramatically lowers the chance of rickets.
And this is the point which made me rethink my position a few years ago: the body of a pregnant or nursing mother makes sure the embryo/baby gets everything it needs; all the stores of the mother's body are drawn down, depleted if necessary. Except for vitamin D, which is often minimal in human milk. Unless, and this has been tested clinically, the mother's 25(OH)D reaches around 40 ng/ml or above; then the milk contains enough of the vitamin and no supplementation is necessary.
There were several studies looking at this, small/medium RCTs if I remember correctly, with supplementation in the range of 4000-6500 IU daily, and besides the milk issue they found effects like a 20% lower probability of birth complications and other improvements. I really should dig out these studies again; I haven't bookmarked them.
The only one that made it into my "google keep" is this one: "New insights into the vitamin D requirements during pregnancy" https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5573964 which is an overview of the state of things during pregnancy, with over 140 references
PS: bear with me, english is not my first language ;-)
Adding more anecdotal evidence to the pile: I have fibromyalgia and have been dealing with flareups for approximately my whole life, without knowing what they were (I wasn't certain I was chronically ill until 2020). My worst episode was in mid-2019, where for nearly three months I was so fatigued that there was a noticeable drop in my work quality, and I had to start working from home because I couldn't make the ten minute walk to the train station. I was essentially bed-ridden for part of the time, and I had at least one day where I couldn't reliably speak or focus my eyes.
Doctors did a full blood panel, and the only unusual thing they found was that my vitamin D was at 24.8 ng/mL, considered insufficient but not deficient. I started taking 10,000 IU daily, and the months-long episode cleared up within a week or two. I've been taking 5,000 IU daily for the past two years and my rate of flareups — which previously happened every couple months, unpredictably — has dropped to basically zero. I also talked to a fibromyalgia specialist (who was maybe a bit of a quack, but ¯\_(ツ)_/¯) who said she recommends all her patients take 5,000-10,000 IU daily — not that I trust much of anything she said, but it did feel like it backed up the idea that I was not totally crazy to think that supplementing vitamin D was helping.
My current hypothesis, or story, of vitamin D is this, in short: vitamin D helps control and adapt our metabolism to summer/winter cycles. Nearly every cell in our bodies has receptors for vitamin D.
The farther a region is from the equator, the less sun and energy reaches the ground during the winter, and in the end the less food is available for animals. Therefore evolution invented hibernation. This is a special low-activity, low-power-consumption mode which helps animals survive hard, long winters without food. In autumn many animals prepare for the coming winter by eating as much as they can and storing a lot of additional energy as fat in their bodies. This fat is then slowly converted into energy during winter, while many of these animals just sleep most of the time. Research on hibernation seems pretty new. Only recently have researchers found hibernation mechanisms in animals which were not known for it, for example Mongolian wild horses, which move much less and reduce their body temperature.
If I look at all the random facts I have learned over the past 20 years, I see a pattern: I have more appetite in dark times, I get fat in autumn and winter, I eat less in summer, I'm depressed when it's darker, I'm happy when it's light, and in winter I move much less than in summer. Vitamin D, SSRIs, serotonin, melatonin, sleep, and mood/motivation/lack of motivation are all related to each other, and so forth.
Not sure where I first heard about the hibernation hypothesis, but I cannot unlearn it anymore.
TL;DR
If I look at all those strange things happening with my body through the 'evolution-hibernation lens', they seem to make a lot more sense to me.
Just remembering random facts I've collected over the years: D3 is animals, D2 is plants. Both need to adapt as well as they can and survive hard, long winters without enough food. Hibernation and low energy consumption, in this life-and-death context, mean shutting down every non-essential function and only maintaining the body over the next 5 to 8 months with the lowest possible energy consumption. Bone growth might be a luxury, an overly active immune system might be a luxury, and moving and exploring are definitely luxuries. I would consume the least amount of energy if I slept all the time.
Our son has celiac disease and has blood tests every 6 months. Just for fun, the gastro doctor said to throw in some extra tests last time around including Vitamin D. It came out very low. We were mortified, and pledged to start supplementing straight away. She told us we could supplement if we wanted, because it's practically free after all, but that every single child she had ever tested came out low, often even lower, and it was nothing to really worry about (incidentally, while our son's symptoms are pretty out there, she has seen it all, including paralysis, fits, and psychosis as a result of celiac).
Anyway, our son has a whole suite of symptoms that occur every time he has a 'glutening' caused by putting his fingers in his mouth after touching a crumb on the bus, or the like. You can set your watch to it. This happens roughly every three weeks, and while they are not life-threatening they are sufficiently serious - both physically and mentally - that we have organised our life for years around minimising them.
Two weeks after we started supplementing, a glutening occurred. All the same symptoms, but much milder. A month later, even milder. A month later, even milder. So mild that, if we weren't so attuned to them, we wouldn't even notice them as part of a pattern. After years of fending off doctors and teachers telling us he needed Ritalin - and making faces at each other when we said his mood swings and hyperactivity were a side-effect of exposure to gluten - we found the cure. Our family doctor - who by the standards of the profession is actually pretty good - made a vague effort to feign interest before telling us not to rule out Ritalin.
A general programme of promoting vitamin D supplementation would, at the bare minimum, save tens of thousands of children from dangerous medication and essentially abusive treatment regimens for hyperactivity. All for perhaps 0.00000000000000001% of the costs of Covid policy. Whatever the actual mechanisms involved (and I lean to a Eugyppius-Yarvin type model rather than the Alex Jones version), the medical industry is functionally a criminal entity. Analyses like this one that don't rest on an understanding of the fundamentally criminal nature of the medical industry in selecting and interpreting data are literally worse than useless.
Scott writes of his priors. Mine—however anecdotal—are as follows:
About a decade ago, having moved about a decade previous to that from southern Ontario to southern England, I had a very bad winter health-wise: instead of a handful of colds and perhaps one illness bad enough to raise a fever, I had four or five bouts of illness with fever and in between I would never fully recover, remaining congested with a persistent cough.
In the decade preceding that winter I had tried taking vitamin C supplements for immune health but had stopped because they did not have any discernible effect. Around this time, though, I read a popular press article written by a prison doctor (probably this one, though the URL is defunct and not archived by archive.org: https://www.medicalnewstoday.com/articles/51913) who had noted that vitamin D supplementation had dramatically improved the health of the prisoners in his care. So, living in northern Europe and spending most of my time indoors (and despite being quite fair-skinned), I figured it was worth a try.
For the next two winters I took around 2000 IU of vitamin D3 every day, and had nary a sniffle. Unfortunately, however, it was difficult at the time to get even modestly-high-dose vitamin D tablets in the UK (the RDI here at the time was something laughable like 40 IU/day) and eventually my imported supply ran out, so the following winter I took none and was terribly sickly again, perhaps not as bad as that first terrible winter, but much worse than I had been the two intervening years.
Since then I have managed to keep myself supplied and have resumed enjoying generally good winter health. Perhaps it's just the placebo effect—though a placebo which proved highly effective against COVID too, as it happens—but good enough to be in the "definitely worth it" category for me.
As the Luxwolda 2012 paper shows, the semi-nomadic pastoral Maasai tribe have a mean level of 48 ng/ml, with the largest subset of the group (40%) having a level between 48 and 60 ng/ml, and the third largest subset (~12%) having even higher levels between 60 and 70 ng/ml. Even including the hunter-gatherer Hadzabe tribe, the largest subset is still the 48-60 ng/ml group, at 33.3%. So the median (which is a more representative value of the norm than the average) will be in this range, higher than the 44 ng/ml mean level stated in this blog post, and so a higher vitamin D intake will be required to reach the median value.
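Whether the median actually lands in that band depends on the cumulative proportions, not just on it being the largest bin. Here is a minimal sketch of the grouped-median calculation; only the 40% and ~12% figures come from the paper as quoted above, and the remaining bin proportions are made-up placeholders purely to illustrate the method.

```python
# Grouped-median sketch: find the bin containing the 50th percentile.
# Only the 48-60 (40%) and 60-70 (~12%) proportions are from the comment above;
# the other bins are hypothetical placeholders for illustration.
bins = [((20, 40), 0.18),   # hypothetical
        ((40, 48), 0.28),   # hypothetical
        ((48, 60), 0.40),   # stated above
        ((60, 70), 0.12),   # stated above
        ((70, 90), 0.02)]   # hypothetical

cumulative = 0.0
for (lo, hi), share in bins:
    if cumulative + share >= 0.5:
        median = lo + (hi - lo) * (0.5 - cumulative) / share  # linear interpolation
        print(f"median = {median:.0f} ng/ml (falls in the {lo}-{hi} band)")
        break
    cumulative += share
```

As long as fewer than half of the population sits below 48 ng/ml and at least half sits below 60 ng/ml, the median lands in the 48-60 band, consistent with the comment's point.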
Vitamin d seems to have some immunomodulatory effect that is beneficial for some autoimmune conditions like the ulcerative colitis I've lived with for the past 15 years. Plus some meta-analysis said it reduced colorectal cancer risk by half and calcium absorption in the colon is plausibly related to that. If it were just "sick people go outside less" there wouldn't be a big signal for colon cancer in particular. On the downside, chronically high but non-toxic levels of vitamin D might accelerate the calcification of the pineal gland, causing melatonin deficiency, causing poor sleep, causing a variety of other health problems. I basically can't sleep well without completely blacking out all sources of light -- streetlights through blinds are way too much light for me.
If the etiology of death by covid is mostly the immune system overreacting and attacking the lungs, and we have other evidence of vitamin D being helpful in autoimmune conditions, our prior for vitamin D benefitting covid patients should not be that low. Dexamethasone reduced covid mortality by half just by blunting the immune system's overreaction against the lungs. Vitamin D is probably not as potent as dexamethasone but more study is warranted. My prior on any sort of antinflammatory or immunomodulatory benefitting covid is at least an OOM higher than my prior on dart-chemicals benefitting covid.
Some studies have supported much higher Vitamin D blood concentrations:
83.4 nmol/L : Vitamin D and mortality: Individual participant data meta-analysis of standardized 25-hydroxyvitamin D in 26916 individuals from a European consortium, by Gaksch, et al., PLOS ONE | DOI:10.1371/journal.pone.0170791 February 16, 2017
> 50 nmol/L: Vitamin D status and epigenetic-based mortality risk score: strong independent and joint prediction of all-cause mortality in a population-based cohort study, by Gao et al. Clinical Epigenetics (2018) 10:84 https://doi.org/10.1186/s13148-018-0515-y
~35 nmol/L: Evidence for a U-Shaped Relationship Between Prehospital Vitamin D Status and Mortality: A Cohort Study, by Sadeq et al., J Clin Endocrinol Metab, April 2014, 99(4):1461–1469, doi: 10.1210/jc.2013-3481
77.5 nmol/L: Vitamin D deficiency and mortality risk in the general population: a meta-analysis of prospective cohort studies, by Zittermann, et al., Am J Clin Nutr 2012;95:91–100 doi: 10.3945/ajcn.111.014779
75 nmol/L: Commentary: Additional strong evidence that optimal serum 25- hydroxyvitamin D levels are at least 75 nmol/l, by WB Grant, International Journal of Epidemiology 2011;40:1005–1007
and finally the big kahuna of them all:
110 nmol/L: An estimate of the global reduction in mortality rates through doubling vitamin D levels, by WB Grant, et al., European Journal of Clinical Nutrition (2011) 65, 1016–1026
But wait! There's more! Namely, magnesium!
Magnesium, vitamin D status and mortality: results from US National Health and Nutrition Examination Survey (NHANES) 2001 to 2006 and NHANES III, by Deng et al., BMC Medicine 2013 11:187. doi:10.1186/1741-7015-11-187
Finally, rather than being a boring bone chemical, there have been some studies on the relationship between Vitamin D levels and Depression:
Vitamin D deficiency and depression in adults: systematic review and meta-analysis, by Anglin et al., The British Journal of Psychiatry (2013) 202, 100–107. doi: 10.1192/bjp.bp.111.106666
"One case–control study, ten cross-sectional studies and three cohort studies with a total of 31 424 participants were analysed. Lower vitamin D levels were found in people with depression compared with controls (SMD = 0.60, 95% CI 0.23–0.97) and there was an increased odds ratio of depression for the lowest v. highest vitamin D categories in the cross-sectional studies (OR = 1.31, 95% CI 1.0–1.71). The cohort studies showed a significantly increased hazard ratio of depression for the lowest v. highest vitamin D categories (HR = 2.21, 95% CI 1.40–3.49)."
Vitamin D Supplementation Affects the Beck Depression Inventory, Insulin Resistance, and Biomarkers of Oxidative Stress in Patients with Major Depressive Disorder: A Randomized, Controlled Clinical Trial, by Sepehrmanesh et al., The Journal of Nutrition November 25, 2015; doi:10.3945/jn.115.218883.
" Baseline concentrations of mean serum 25-hydroxyvitamin D were significantly different between the 2 groups (9.2 6 6.0 and 13.6 6 7.9 mg/L in the placebo and control groups, respectively, P = 0.02). After 8 wk of intervention, changes in serum 25-hydroxyvitamin D concentrations were significantly greater in the vitamin D group (+20.4 mg/L) than in the placebo group (20.9 mg/L, P < 0.001). A trend toward a greater decrease in the BDI was observed in the vitamin D group than in the placebo group (28.0 and 23.3, respectively, P = 0.06). Changes in serum insulin (23.6 compared with +2.9 mIU/mL, P = 0.02), estimated homeostasis model assessment of insulin resistance (21.0 compared with +0.6, P = 0.01), estimated homeostasis model assessment of b cell function (213.9 compared with +10.3, P = 0.03), plasma total antioxidant capacity (+63.1 compared with 223.4 mmol/L, P = 0.04), and glutathione (+170 compared with 2213 mmol/L, P = 0.04) in the vitamin D group were significantly different from those in the placebo group"
If I recall correctly, there were pretty robust meta-analyses showing that Vitamin D supplementation helped recovery from respiratory infections long before COVID. Not cure-all good, but still meaningfully good.
Pretty much everything in the body is the same. Take out 100% of X, and you die horribly, whether it's a vitamin or a microelement. This doesn't mean that having extra X is a good thing - sometimes it's harmless, sometimes it kills you just as horribly (you should have seen the look on my doctor's face when she saw my potassium levels).
What we're doing here is looking for the very few X-es where supplementing is doing a net good with a low risk. That's a more complex issue, because it's basically a question of environment and luck. If X is too low, why? Humans have evolved to thrive on close to the most varied diets in the animal kingdom, so how come we don't get enough of that specific "X"? The reason put forward for vitamin D was light exposure, and is part of the reason why it's a lot more studied than something random by cobalt. But we'd have expected to find something by now.
> I’m not sure why they flip-flop between “lower doses are better” and “lower doses are the same”,
I am not a scientist and I don't read many studies, but it's weird to me how many times I read people like Scott talking about studies and see these sorts of incongruities in the text of the study.
As someone who writes scientific papers, even with multiple co-authors going over a manuscript and sending it out to colleagues for pre-publication review, and peer-review during the publication process, it is still shockingly easy to both make these errors and fail to catch them.
My best-guess explanation is that for small, phrase-level errors (small as in length in the text not in magnitude) like this, when you are familiar with the content and know what it _should_ be, your brain is very very good at skimming it and filling in what it's supposed to be.
As authors we know this (usually learned through experience) and try to catch these mistakes, but in the process of writing, it's likely that dozens get made at some point in one draft or another so the fact that a relatively large number of papers have 1 or 2 make it into the final publication draft does not surprise me that much, as unfortunate as it is.
Seconded. Of course we try to be as careful as possible, and I personally proofread every letter of a paper at least twice just before publication, but as the saying goes: "The total number of errors in your paper is at least N+1, where N is the number of errors you found in proofreading."
Maybe it would help to have a reviewer from a completely unrelated field? Someone less likely to autocorrect small mistakes?
My understanding regarding covid was more that there was a correlation between higher vitamin D levels and faring better with the virus, including long covid. Which leads to some experimentation with giving vitamin D as treatment, but really levels need to be up before then. Also, just anecdotally, my friends and I who have thyroid issues all have had lower vitamin D levels less than 30 ng/mL (which is what one study mentioned as a threshold back in 2020) and have all been advised by our doctors to take D3 w/K to supplement daily.
And just another thought is that nothing about our body is boring. Underestimating the mystery we are still exploring is definitely going to lead to us missing something.
Right on
Yes, exactly.
For every chemical compound we know about, there are probably a dozen related compounds we don't know about. And for each one of these, there's probably a dozen or so positive effects of increasing the dosage, and another two dozen or so negative effects of increasing the dosage. Medicines IRL are not like video-game meds where more = better.
Scott is making two classic mistakes: First in binary thinking; Second exercising first level thinking.
Outside of perhaps polonium, any chemical is neither good nor bad; there are multiple tradeoffs to dosages. Chemicals interact with other chemicals, chemical receptors interact with different chemicals, and cellular chemical interactions interact with other cellular chemical interactions. It's way complex, too complex to even imagine.
I don't see where Scott is making either of those mistakes.
Scott isn't making those mistakes. And despite your last paragraph, we can still make blanket statements about the effect of ingesting certain things by doing well designed studies. Just because something is complex doesn't mean we can't know anything about it.
You're not really making any arguments here - just giving us more anecdotes which are totally useless. If you could produce a study that showed, for people with less than 30 ng/mL, taking the dose you're taking daily would lead to a longer or better quality life in real people, then we have something to discuss. No such study exists, and your anecdotes about having a prescription from your doctor prove nothing.
This works both for and against Vitamin D being relevant to COVID. It could be sunlight generally, warm weather, being outside, Vitamin D, some other sunlight/outdoor/warm weather-related chemical, or a combination of any of these. Or it could be a coincidence, or related to some third factor that affects both time spent in the sun and COVID - for instance the fact that healthy people are more likely to be outside, and old people are least likely to spend lots of time outside and also most likely to get sick and die.
Vitamin D is considered a likely culprit because it's an identified chemical that our bodies seem to use for good effect, but it's hardly the only possibility.
Vitamin D levels, since they are linked to sun exposure, get correlated with all sorts of things. Multiple Sclerosis is relatively more common in the Pacific Northwest and one theory is that it has something to do with a lack of vitamin D since the area is often cloudy and rainy.
If you move your weekly pinochle game outdoors, odds are you will be less likely to get COVID if one of the other players is infected. Is it the sunlight and increased vitamin D? Is it the better air circulation than in that stuffy back room? Is it the exercise from chasing the cards when a gust blasts them off the table?
Again, Vitamin K supplements and excessive usage should be medically evaluated, and I am not a big fan of running to the doctor at all, having worked in medicine. D, C, and zinc are good immune boosters, and so is natural sunshine. :) yep.
> If you want to go higher in that range, you can trade off a tiny mostly-theoretical risk of a very mild insufficiency for a tiny and mostly-theoretical risk of a very mild toxicity.
Is there any research about what that risk of toxicity could be? I assume that Hoffman is currently taking huge doses of vitamin D to match his assumptions about ancestral populations. Is there a specific reason to believe that he is in danger here (as opposed to generic "too much of anything can be dangerous" reasons)?
I looked into this a while ago; one of the studies I found looking at case reports of observed toxicity was from https://doi.org/10.1111/cen.12836 (available on sci-hub), whose patients had been given between 30,000 and 200,000 IU per day for at least a month. I haven't found any mention of toxicity from daily doses under 20,000 IU in the literature, but definitely could have missed something.
In discussions of toxicity, I would think some consideration might be given to the propensity of high levels of vitamins D to calcify soft tissue. K2 sufficiency may be a relevant factor in soft tissue calcification.
"5.1. Hypervitaminosis D and VC
Induction of calcification through hypervitaminosis with vitamin D has been demonstrated and well characterised in multiple animal models, including mice, rats, goats and pigs (see Table 1). Treatment of rats with sublethal doses (7.5 mg/kg) of vitamin D plus nicotine produces a lasting 10–40 fold increase in aortic calcium content, resulting in the calcification and destruction of medial elastic fibres, subsequently leading to arterial stiffness [84]. In goats and pigs, dietary supplementation of vitamin D promotes the development of aortic and coronary calcified lesions in association with elevated serum calcium and cholesterol levels [85,86]. Vitamin D induced calcification in mice is currently considered to be one of the more robust models of calcification, in which single doses of 500,000 IU/kg/day can produce severe aortic medial calcification after just 7 days following 3 consecutive days of initial treatment [87]. Interestingly, a recent study produced a variant of this model in which mice initially treated with a lower dose (100,000 IU/kg/day) for 7 consecutive days developed moderate aortic calcification outcomes at 28 days (unpublished). In addition to precursor forms of vitamin D, such as D2 and D3, dosing of its active metabolite, calcitriol, also produces diffuse and widespread soft tissue calcification that has been demonstrated in a time-dependent manner in rats [88]. Despite the number of in vivo models, evidence to explain the clear mechanisms of action by which excess exogenous vitamin D promotes calcification is still lacking."
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5986531/
Aren't the doses used here ludicrously high? 100k IU per kg per day on the low end?
Toxicology studies are often like that. They try to substitute high doses for long time frames. Yes, it has issues. But so does trying to run an experiment for 40 years.
I would argue that this approach, substituting high doses for long time frames, is inherently useless. I can't see any way to avoid producing blood concentrations higher than you EVER get in the real world, assuming the human body can reliably use what it needs, dispose of some amount, and only store what's left after that point if anything. The fact that the alternative can't be done doesn't make this option correct. If you have no good options, the least bad option doesn't magically become good enough to rely on.
I think the statement that vitamin D is "a boring bone-related chemical" can be easily dismissed.
A recent high quality study on 25000 people found it is effective for autoimmune disease: https://www.bmj.com/content/376/bmj-2021-066452
As for COVID, we already discussed it enough, but worth mentioning there's a new study showing strong results (RR 0.23): https://www.sciencedirect.com/science/article/pii/S0188440922000455?via%3Dihub
Bouncing off that, summary of my post: seasonality, vaccination status, and hey that long covid risk 3x with Vit D in the study?
One thing is that vitamin D levels are seasonal, and the seasonality of studies should be considered in these analyses. E.g. the RR 0.23 study ran from June through Dec 2020, vs. CORONAVIT on ARI prevention, which ran from Dec through June 2021 (unclear: October is mentioned in the text for evaluating participants for inclusion, and the first month of follow-up was December, so it started in November?). If vitamin D takes a while to ramp up, and it's ramping up while cases are highest, that's going to reduce the effect. Measuring vitamin D at the end, *during the summer*, to compare groups - I did not understand that. I know studies don't have infinite budgets, but measuring vitamin D levels is cheap, doable by post it seems, and measuring them at onset of illness or at monthly follow-up would really help in understanding everything going on.
Vaccination, especially right after it, has a very strong effect on COVID-19 infection and is a serious potential confounder. Table S3 reports infections and long covid by vaccination status *at the end of the study*, not at time of infection!? When vaccination happened during the study is important but not analysed. Frankly, the "participants looked the same % in June" point in the appendix was IMO not sufficient; when vaccination happened affects time with exposure. Yeah, yeah, it's supposed to be an RCT, but most RCTs are not designed to run concomitantly with the biggest intervention known to work.
I don't get why no one is talking about the frankly scary 3-4x RR for long covid for those on vitamin D in the study. Yes, it is noisy and the study was not powered for that outcome, but that is scary. I am guessing there is a non-linear effect of too-high vitamin D, sadly particularly among women (who suffer from autoimmune disease more). LC risk is also potentially confounded, again, by vaccination status at time of infection.
In general with vitamin D and with anything with threshold/sufficiency/non-linear effects, things are really complicated. Scott's post talking about averages, that is a start, but maybe all this stuff is complicated and not agreed on b/c people have trouble thinking through all that. Any time I see a vitamin D study without an analysis that is at thresholds/sufficiency as well as "on average" it is just weird to me.
Good points. To fully compensate for modern diet and indoor lifestyle by supplementation alone would probably require a complex customized protocol.
If possible, best option is just be outside and eat well...
That first study was only in older adults (mean age 67 ish), was barely statistically significant, will likely be unreplicated for many years, and only reduced the incidence of autoimmune disease from about 12 per 1,000 to 10 per 1,000 - doesn't seem particularly interesting or surprising.
Note that the effect accumulates over time, and in later years it becomes strong (-39%) and very significant (p=0.005).
Anyway, the point was that claiming vit D doesn't have any effect beyond bones is clearly wrong. That's all.
Yes, there are always ways to cut up the data to get statistical significance in some subgroup or over some time period. You must have missed the part where Scott literally says: "I accept there are Vitamin D receptors on immune cells and a couple of other things, and it probably plays some role in modulating those." It's debatable that the study justifies any positive statement about Vitamin D's ability to prevent autoimmune disease, especially in the absence of any replication or further studies, but no one here is claiming that it doesn't have any effect beyond bones.
Scott, no reply about the Mexico City study that Saar links 2nd above? My detailed analysis of this study vs the negative CORONAVIT UK study can be found here: https://twitter.com/KarlPfleger/status/1516918243390156800
This article inspired me to take a Vitamin D pill (1000 IU).
I was at 16 ng/ML 2 weeks ago. Started to supplement at 50K IU a week for 3 months, will check my blood again when done. Out of curiosity, if readers here were to make me a market for my Vit D level at the end of this period (even odds buy/sell levels), what would it be?
Are you taking one dose weekly? I don't really have a great citation for you, but the chemically active metabolite is two reactions downstream from the supplement chemical. I'd think there's a risk of the kidneys filtering out A such that the concentration of B eventually falls too low to keep up production of C.
Go re-read comments, find the ones about the negative effects of tissue calcification, and we'll lay odds on you surviving a week.
Here's hoping you don't go all crunchy.
If I'm understanding the equation on the graph above correctly, it should even out at about 51 ng/ml.
This changed my mind -- I had agreed with Ben for the last year, and now I agree with you instead.
There is folklore that Vitamin D provides some benefit against the winter blues (SAD). A quick Google search has some sites which say the evidence is 'mixed' for this purpose but also that some studies claim Vitamin D increases levels of serotonin in the brain which something something helps with depression.
How credible should I find this theory? I've been taking Vitamin D supplements for years and anecdotally they seem to be helpful (weeks I don't take my supplements I generally feel worse - of course, this could be correlation not causation...), but it also seems quite likely I'm just a master placebomancer.
Given the great many beneficial things Vitamin D has wrongly been credited with, I think your prior should be very low that it helps with SAD.
Balloons help with SAD
Placebos should not be discounted
Well, I guess my response is that I wouldn't say "Balloons help with SAD". I'd say specifically that "balloons help generate a placebo effect that help with SAD". This seems to more accurately represent reality and what people are talking about when they use these words.
The placebo effect might just be regression to the mean.
I'm not sure. Seasonal Affective Disorder is definitively linked to light input (we know that because really intense light supplementation is an effective therapy.) That automatically puts the prior on Vitamin D being linked to it notably higher than a random chemical.
A decent prior seems appropriate (10% maybe.) I'm not convinced that Vitamin D is helpful, but it wouldn't surprise me if it were. (Whereas it WOULD surprise me if Vitamin D helped with COVID or the flu or heart disease.)
Hmm, I'm also not sure.
Just like you have a plausible story about how Vitamin D might help SAD, I think probably there are stories you could tell about plausible reasons to believe Vitamin D would help many of the other things people thought it would help? I'm not an expert in the history of Vitamin-D-will-help-this-thing theories.
My prior is that there's infinite numbers of incorrect but plausible stories you can tell about a phenomenon.
You might put Vitamin D in the class of theories that had plausible stories but didn't work out and not in the class of theories that said it'd help with COVID or whatever. My issue is that I'm not sure has-plausible-story should raise your prior much.
Counterpoint: dawn simulators and lightboxes help with SAD, and you're not getting any vitamin D off them.
If anything, that should make us update away from vitamin D doing anything - the causal agent is something related to sunlight, like... just perceiving light. Or melatonin and fixing the circadian rhythm (my bet). Or something else entirely.
Why wouldn't you be getting vitamin D off of "dawn simulators and lightboxes"?
They produce light similar to sunlight – and there isn't any vitamin D in the solar wind as far as I know.
Maybe you're pointing out that _some_ lights don't emit UV (or other) frequencies? Some lights _do_ emit UV tho.
I wouldn't be surprised if bright (visible spectrum) light like sunlight is _also_ helpful for other reasons like you mention tho.
Neither lightboxes nor dawn simulators emit significant amounts of UVB, by design - UVB is a strong carcinogen. You can use a tanning bed, I suppose.
Are you sure? These were at the top of a Google search for "sunlamp uvb" for me just now:
* https://www.amazon.com/Alaska-Northern-Lights-Sperti-Sunlamp/dp/B01LBI1BIO
* https://www.amazon.com/TEKIZOO-Intensity-Self-Ballasted-Basking-Amphibian/dp/B07Y9MBBFQ
Can you tell that they don't emit "significant amounts" of UVB?
It's a "strong carcinogen" – but not _that_ strong; not literally 'don't ever go outside during the day if you can help it' strong.
I find the standard warnings about 'tanning beds' to be modestly convincing!
But I suspect that there are 'sunlamps' that provide UVB like/similar-to sunlight, and so should mimic whatever effects UVB has on us thru exposure to sunlight.
It wouldn't surprise *me* if vitamin D were linked with reactions to COVID, the flu, or heart disease. But the effect could go either way. (From the above comments I'd expect that high levels of vitamin D might be linked to increased probability of heart problems. Inflexible blood vessels don't sound good.)
Characterizing vitamin D as a boring bone related vitamin is a little weird to me. I'd considered it more of a testosterone precursor; this RCT
https://pubmed.ncbi.nlm.nih.gov/21154195/
found substantial increases in T levels after a year of supplementation.
In any case it's cheap and not likely to be toxic. Gonna keep taking my Pascal's wager in the form of my 4000 IU/d.
"In any case it's cheap and not likely to be toxic. Gonna keep taking my Pascal's wager in the form of my 4000 IU/d."
This is the part I don't really understand. To quote Tim Meadows, "it's the cheapest drug there is." So maybe it's beneficial. It can't hurt you (marijuana can folks, great joke but not equivalent), it costs nothing, takes a half second out of your day. Even if the 50 studies that all generally indicate it's beneficial are all somehow just misinterpreting a boring bone drug, meh.
Go read the comments about tissue calcification. This means calcifying your arteries, and you die of an aneurysm or heart attack. Some studies killed rats within a week.
Within a week, yes, but at doses thousands of times higher than any discussed here. I would have to somehow daily consume three containers of 365 tablets containing 4000 IU each to reach anywhere near the doses in the calcification study. I'm going to ignore that study, the same way I ignore any study where I would have to daily ingest my body weight in iron filings, drink a bathful of water, or munch a cubic meter of shredded wheat to reach the dose studied.
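A rough back-of-the-envelope check on that gap, assuming a 70 kg body weight (the study dose is per kg, so the exact factor depends on weight):

```python
# Order-of-magnitude check: mouse-model calcification dose vs. a 4,000 IU/day tablet.
# The 70 kg body weight is an assumption; the study dose is given per kg of body weight.
body_weight_kg = 70
study_dose_iu = 100_000 * body_weight_kg      # "lower dose" protocol: 100,000 IU/kg/day
tablet_iu = 4_000

print(study_dose_iu / tablet_iu)              # 1750.0 -> roughly 1,750 tablets per day
print(study_dose_iu / (3 * 365 * tablet_iu))  # ~1.6   -> vs. "three containers a day"
```

So the animal-model dose is roughly three orders of magnitude above a 4,000 IU daily tablet, which is the point; the "three containers a day" figure lands in the same ballpark at this assumed weight.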
You see an unrealistic dose, I see a man afraid of a challenge.
Yes, but the leading cause of death is arteriosclerosis, which is calcification of the arterial lining. If Vit-D is a contributing factor, and arteriosclerosis is already the leading cause of death, perhaps Vit-D balance is, or could be, a problem.
Sure, this could well be the case. However, the study we are commenting on doesn't seem to be strong evidence, just like a study measuring the LD50 of water is not that useful when trying to decide how much water to drink daily.
Technically it costs a few dollars a month.
Everyone needs to learn more about hepatic metabolism
"because they confused correlation and causation (sicker people have less vitamin D)."
I think it's weird how little this kind of context is discussed with regard to Vitamin D. Shouldn't all Vit D studies be controlling for CRP or some other marker of inflammation? Also, what about the risk of arterial calcification if Vit K2 levels are too low? I think we could even give Vit K2 a lot of the credit for increasing bone strength that is currently given to Vit D. As some have remarked, if you're only looking at bone mineralization, a piece of chalk appears brilliantly white. But it's easy to break.
I admit to not having checked in on this topic in about a decade, but the calculations still just seem a little... acontextual.
Also, I've sometimes wondered, given that statin drugs reduce cholesterol and vitamin D is produced from cholesterol, what portion of the benefit of statin drugs comes from their reduction of vitamin D levels in individuals with chronic inflammation. Perhaps that's an errant thought, I admit, but I wonder.
> The nationwide average is about 27 ng/ml, but black people (whose dark skin blocks sunlight) are almost all insufficient and bring down the average; for whites, it’s about 30 ng/ml.
Are American blacks almost all at insufficient levels of vitamin D, or do they just require less of it?
> a typical model of “deficiency” (technically insufficiency, you’re not supposed to use the word “deficient” unless there are observable health consequences)
Wait, so we're *stipulating* that there are no negative effects, but we're calling them "insufficient" anyway, just for fun?
I think the first (by usual standards of insufficiency) but there's not great evidence that they're really suffering because of it. Some people attribute higher levels of schizophrenia in this population to Vitamin D deficiency, but I think it's a stretch. Probably a non-psychiatrist would know of some other problems it's causing them.
> I think the first (by usual standards of insufficiency) but there's not great evidence that they're really suffering because of it.
I don't get it. I ask whether clinical "insufficiency" has anything to do with "insufficiency" in the ordinary English meaning of the word, and you respond by repeating that they really really are clinically "insufficient"?
If the African-American level of vitamin D isn't causing any problems, why are we calling it "insufficient"? What is it insufficient for?
Baseless theorizing: "insufficient" means problems are observable in a big enough population, "deficient" means they're observable in the individual?
Sorry, I didn't understand what you were asking.
I don't think there's any evidence that American blacks require any less Vitamin D than anyone else, so if the usual standards for Vitamin D are right, they are probably suffering from not having enough of it. I don't know enough about calcium metabolism to know the exact way they are suffering, but I would expect it to involve at least higher chance of bone problems and maybe other things too.
> I don't think there's any evidence that American blacks require any less Vitamin D than anyone else, so if the usual standards for Vitamin D are right
But I think this is a shocking position to default to. The evidence that American blacks require less vitamin D (in terms of bloodstream concentration) than American whites do is precisely the fact that they systematically have less of it. You begin from the assumption that people's bodies are working correctly, not from the assumptions that (1) everyone's bodies are meant to function in exactly the same way; and (2) we have perfect knowledge of what that way is. We know that both of those assumptions are false!
Why aren't we saying that Asian American men have chronically excessive levels of testosterone? Why aren't we saying that *blacks* have chronically excessive levels of testosterone?
Black people are native to tropical climates (in particular, sub-Saharan Africa); their dark skin causing vitamin D deficiency further north is quite likely, given that one of the theories of why light skin evolved to begin with was to deal with vitamin D deficiency.
Not to mention the fact that black people do, in fact, have worse health outcomes overall.
No it's not. We're talking central biochemistry, the kind of stuff that is found essentially unchanged as far back in our evolutionary tree as our rodent ancestors, whereas skin color is a very recent divergence in our family tree. The reasonable default position is to assume that central biochemistry works the same in all races, absent some evidence that for some odd reason a very late change in a small set of genes (coding for skin color) has also changed some core metabolic function.
You'd have a much better argument if the issue were a distinction between sexes, since sexual dimorphism goes back hundreds of millions of years, and it *is* reasonable to assume that even core biochemistry may differ somewhat between the sexes.
I'm going to have to repeat my question:
>> Why aren't we saying that Asian American men have chronically excessive levels of testosterone? Why aren't we saying that *blacks* have chronically excessive levels of testosterone?
This is not a difference between sexes; it's a difference in core biochemistry between males of different races. And that's the norm; we expect noticeable differences in *everything*.
> The reasonable default position is to assume that central biochemistry works the same in all races
This is absolutely not a reasonable default position. Every time we look for racial differences in any aspect of human biology, we find them. This "default position" has been falsified _every time_ it's been investigated.
I've always had a similar theory regarding African Americans and hypertension. When 80-90% of otherwise perfectly healthy black people all have "high" blood pressure, maybe it's not actually high.
Only 41% of black people in the US have hypertension, and given that a lot of black people are fat and end up dying from cardiac issues, this seems less like a "they have a different range" thing and more like a "they're fat and die of heart disease" thing. They do have reduced life expectancy.
Moreover, the rate of hypertension in blacks in the US is higher than it is elsewhere, again suggesting it isn't genetic but related to obesity/lifestyle. Outside of the US, black people don't actually have particularly high blood pressure.
I'd recommend looking into this further. There's quite a lot of recent research on Vitamin D levels in Black Americans, and much of it points to inter-racial differences in what constitutes deficiency. For example, Black people have lower total 25(OH)D (many studies only measure serum concentrations, which is much easier but also less accurate), and much lower levels of Vitamin D-binding protein. 80% (!) of the variation in the latter was explained by genetic polymorphisms. Lower VDBP significantly increases the bioavailable fraction of total 25(OH)D.
https://pubmed.ncbi.nlm.nih.gov/24256378/
Humans and mice that don't produce VDBP at all have very low levels of Vitamin D3, but *do not* have more bone problems as a result.
https://www.frontiersin.org/articles/10.3389/fendo.2019.00910/full
Now, the research is definitely not unequivocal on this, and there are studies suggesting that Vitamin D deficiency is widespread in Black mothers and this explains some of the disparity in pregnancy outcomes. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3222336/
But what it does show is that we probably can't measure Vitamin D deficiency and insufficiency in the same way in Black populations, which could certainly impact the accuracy and interpretation of results from several of the studies in this post, including the Hadza data.
On an evolutionary level, I would expect that a reduced need for vitamin D would be more likely to evolve in the group that had sustained low vitamin D levels for long enough to change the color of their skin to try to make up for it. People whose skin is still dark are likely to have bodies that are adapted to higher levels of vitamin D than people whose skin has lightened to try to make up for ancestral deficiency.
Which brings up an interesting question for Scott. If Vitamin D is really just a boring bone chemical, was the increased risk of broken bones what drove the evolution of pale skin in northern latitudes? Do we accept that this is a thing which could happen, or are we going to reevaluate our reckoning of where pale skin comes from?
If you'd like to know way more, and probably too much, about vitamin D, this series is good: https://www.devaboone.com/post/vitamin-d-part-1-back-to-basics
That’s my wife’s article, just finished reading the article above and I’m going to forward it to her for some thoughts.
Your wife's a great medical communicator, I think her take on vitamin D is far superior to Scott's here. Please do let her know that her summary of the VITAL trial as finding "No benefit" is now false, as secondary analysis has found a significant 22% reduction in autoimmune disease incidence over 5 years: https://www.bmj.com/content/376/bmj-2021-066452
Thank her for writing it. I originally found it, and you, via HN.
It's a good take on side effects of extra vitamin D. Biased due to the author's specialty, but still a good reminder that yes, there are side effects.
The big update here is that before recommending people larger doses, one should ask them if they had their levels checked recently, in the off case of already being high. But like Scott said, chances are pretty small.
Thanks for sharing. The biggest news for me there is that she sees that people with vitamin D levels over 70 ng/ml often do have toxicity. She suggests 30-60 is the real range to aim for.
My memory is that the first time I thought vitamin D might be useful for covid was when I read articles pretty early on in the pandemic that claimed that African-Americans and South-Asian-Americans had the worst results in America, but Africans and South Asians were actually doing pretty well. There are other things that could explain this (mostly that countries near the equator are not logging all the data on deaths), but dark-skinned people in higher latitudes having the worst outcomes seems to fit the vitamin D story pretty well, and makes my prior higher than yours that vitamin D does something for Covid.
I recall hearing such studies about South Asians in the UK. But India seems to have been hit significantly harder by COVID than Africa (perhaps that's a bias in what manages to get reported in the US).
Yeah that’s fair. I’m thinking back to articles that came out before the delta wave that really hit India hard, but the hypothesis is definitely weakened by how India has done since then.
> Some people originally thought Vitamin D did all those things. They mostly thought this on the basis of studies, done at low doses, which found that it did. Those studies mostly found that it did because they confused correlation and causation (sicker people have less vitamin D). Then we did better studies (still at low doses) which found that none of those things were true after all, at least at the low doses which the studies investigated.
> If we then say “Yeah, but it could still be true at higher doses”, we’re missing the point. Now that our original reason for thinking it’s true is no longer valid, we should be back to our prior for any given random chemical, like hydroxymethylbilane.
Among other things, Andrew Gelman occasionally writes about the problem in science publishing of chronological non-independence. He points out that in reality, the truth is not affected by whether you do study A first and follow it up with study B, or whether you begin with study B and follow it up with study A.
The problem is that the norm in publishing is that someone does a low-quality, low-information study that is so bad that it can produce a spurious finding of statistical significance. (This is easier for bad studies to accomplish than it is for good ones. Incentives!) That study has a surprising, interesting, and "statistically significant" result, so it gets published. It is now Official Science, because it's published.
Someone else will then do a better study finding no significant effect. This is also easy to do, because the effect the first study found was an artifact of chance and low quality. And because the first study was published, the second one can be too. But nobody is willing to admit that the second study removes every reason to give any credence at all to the first study. The effect in the first study is real, by definition, because it's published. So the second study is always taken as refining the result, not as rejecting it. You can't reject an effect that is really there. Perhaps the effect is only present when the researcher's last name starts with P.
Gelman writes that this is ridiculous because -- among several other reasons -- if the studies had been done in reverse order, everyone would come to the opposite conclusion, rejecting the idea that the effect was real under any circumstances.
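A toy simulation makes the asymmetry concrete (the numbers below are mine, chosen only for illustration, not anything taken from Gelman's writing): if a small, noisy study only gets attention when it clears p < 0.05, the published estimate is essentially guaranteed to exaggerate a tiny true effect, which is exactly what the larger follow-up then fails to find.

```python
# Toy illustration (my own made-up numbers): a tiny true effect, a small noisy study,
# and a "publication filter" of p < 0.05. Conditional on passing the filter, the small
# study's estimate is wildly exaggerated; a larger follow-up correctly finds ~nothing.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_effect = 0.05            # assumed true effect, in standard-deviation units
n_per_arm = 20                # the small, low-information first study
published_estimates = []

for _ in range(10_000):
    treated = rng.normal(true_effect, 1.0, n_per_arm)
    control = rng.normal(0.0, 1.0, n_per_arm)
    t_stat, p_value = stats.ttest_ind(treated, control)
    if p_value < 0.05 and t_stat > 0:     # "surprising, significant" result gets published
        published_estimates.append(treated.mean() - control.mean())

print("true effect:", true_effect)
print("average published estimate:", round(float(np.mean(published_estimates)), 2))
# The published estimates average around 0.6-0.7 SD, an order of magnitude too large,
# regardless of which order the small and large studies happen to be run in.
```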
I'm reminded of one of the early rationalist efforts involving using more light to treat seasonal affective disorder (SAD): https://www.medrxiv.org/content/10.1101/2021.10.29.21265530v1
The attitude was that it was an obvious thing to try, that it was easily and safely testable, and that it could help people. It looks like a civilizational failure that no one has tried it yet.
I think that the solution to the riddle is that most of the people trying to treat SAD are doctors and that doctors have justifiably strong priors against dramatically increasing the dosage of anything. "What if we tried more power?" is a refrain for scientists & engineers, not for doctors. Even the safest drug might do nasty things if used in high enough doses. Even water has an LD50.
The particular example of treating SAD with more light is distinct enough from the things that doctors build their intuitions on, that we shouldn't be surprised when they miss something.
> yes, this is the second scientist in this essay studying sun exposure with “Lux” in their name
Unrelated to Vitamin D, but related to light: one ought to do an RCT on nominative determinism in general and light-related names in particular. During my Master's I read a seminal paper on light coming from black holes, by Jean-Pierre Luminet (https://en.wikipedia.org/wiki/Jean-Pierre_Luminet).
Can't forget the Lumière brothers, who did some of the first motion film: https://en.wikipedia.org/wiki/Auguste_and_Louis_Lumi%C3%A8re
So, I might be overly emphasizing individual experiences, but I am trying to figure out how a modern urban/suburban office-working human is getting anywhere near enough sunlight to approximate ancestral sun exposure, even assuming damp little northern island ancestors who wore hats and bonnets a lot.
(I am also remembering time in Korea and other Asian countries, where middle class office working gals walked outside with a purse on their shoulder to keep their faces shaded, at least until they were married, as tans were seen as quite low class and not pretty. So maybe my perception of ancestor exposure is not entirely accurate.)
To me, assuming that diet replacement is sufficient is...assuming a lot? Even reading Scott I am not convinced otherwise.
I think this is just something (like many others) that is tricky to figure out exactly.
Just because our ancestors adapted to, and managed to survive and reproduce living in, some particular conditions doesn't mean that those conditions were somehow _exactly_ perfect for us.
It's _possible_ that we just 'put up with' a lot of 'abuse' from the environment. (And it's also possible that that abuse was something like 'useful stress' too!) And given that everything is entangled-with/correlated-to almost everything else, it's possible that it's just not practical to ever get a clear answer as to what the 'perfect' dose is for anyone, either in groups or in particular.
Sitting in the morning sun on my east-facing balcony is my favorite source of vitamin D. It's best applied with dark coffee and a sativa/indica blend. As I haven't got covid-36 yet, that's postmodern clinical proof that the process cures cancer.
The reason for the persistent association of ill health and low vitamin D is most likely confounding. Deficient animal meat intake and low sunlight exposure are independently associated with unfavourable outcomes, and vitamin D mostly serves as a marker for both.
But since there is an anti-meat campaign (supported by all kinds of special interests) on the way, and an anti-sunlight campaign (supported by mostly the same special interests) as well, none of that will get rectified in the near future, so we can enjoy tedious debates such as this one about whether a boring bone hormone somehow will prevent you from dying.
The vitamin D supplementation bandwagon has been going on long enough, can we please put it to sleep?
I'm giving this comment a minor warning (25% of the way to a ban) - it seems kind of contemptuous, paranoid, and doesn't really explain itself. I feel more comfortable doing this since it's agreeing with me - a lot of people who disagree with me are worse but I have to worry about real or perceived bias.
My bad. Long form (barring the first paragraph, which I think is adequate):
Caution against sunlight exposure has arisen from an epidemiological association of sunlight exposure and sunburn with melanoma. This association was never really strong, though (RR 2.0), and did not appear in all studies. It was largely driven by reactive physician anxiety resulting from an epidemic of skin cancer overdiagnosis (a summary: https://www.nejm.org/doi/full/10.1056/NEJMsb2019760). It also has serious holes: logic would dictate that most melanoma should appear on the face and hands, since exposure is high there due to a lack of clothing, yet this is not the case (more summary of contradictions in this body of work: https://www.bmj.com/content/343/bmj.d5477). Why did we go this direction? Nobody really knows; it appears to be the usual confluence of faddism and groupthink in academic medicine.
There is an elephant in the room, though: the sunscreen industry. Increasingly aggressive high-SPF lotions (SPF 50 to 100 was unheard of even fifteen years ago) are being marketed to everyone, virtually guaranteeing that nobody develops any tolerance to UV light the natural way anymore and keeps buying more sunscreen. Sunscreen is now sold to black people, who don't need it, but marketing does the trick I guess (https://www.outsideonline.com/health/wellness/sunscreen-sun-exposure-skin-cancer-science/ this is a shoddy article and probably conflicted in interest, but the interviews are gold).
All the while another association has popped up: sunlight exposure and longevity/some markers of general health (https://pubmed.ncbi.nlm.nih.gov/24697969/). This keeps being ignored, mostly, and it will likely take years before anyone does conclusive interventional studies on it, since all IRBs will run amok due to being biased (UV is far too dangerous to do an interventional trial, yada yada). So here, exemplarily, a confluence of a mediocre academic belief with industry interest results in a deadlock which potentially jeopardises people's health, yet is guaranteed to persist without any challenge being possible at all.
The same thing happened with meat consumption: academics championed the idea that meat consumption is responsible for coronary artery disease based on legendarily bad evidence in the 1960s. This was immediately supported by the food industry, which was busy hiding the fact that it was killing people with sugar (check Cristin E. Kearns' work for the full monty: https://sugarscience.ucsf.edu/cristin-kearns.html) and eagerly jumped on the fat-heart-disease bandwagon. The evidence is decidedly mediocre (https://pubmed.ncbi.nlm.nih.gov/30958719/ and https://bmcmedicine.biomedcentral.com/articles/10.1186/1741-7015-11-63 and a few hundred studies just like them). Sometimes the association appears to rely entirely on processed meat, which like all processed food is associated with poverty and is thus likely just another proxy for low socioeconomic status (I don't have the space to delineate the entire controversy here).
Meanwhile, the societal trend towards animal rights has led people towards vegan diets. Evidence is now trickling in that veganism is associated with all kinds of disease (https://bmcmedicine.biomedcentral.com/articles/10.1186/s12916-020-01815-3 or maybe this one https://pubmed.ncbi.nlm.nih.gov/32483598/ ...), but again we will have no conclusive evidence for years to come (there is typically not even an attempt to study diet interventionally).
There is also an elephant in the room, though: the food industry is not particularly fond of meat. It is a low-tech product (a gun and some knives are all that is needed for production), the raw material (raised and finished animals) is expensive as hell, it needs manual labor to boot, and production of high-value meat is decentralised in mom-and-pop specialty butcheries. The only money to be made in meat is by exploiting immigrant laborers in giant slaughtering and packaging plants with extremely poor working conditions, frequently aided by a currency exchange gradient (remember the COVID disaster in the meat plants? This is how we got there).
The food industry would more happily sell vegan meat substitute, which is cheap to make from cheaply farmed cash crops, even more cheaply produced in a highly industrialised factory and sold with a gargantuan markup.
Again, a confluence of rather poor academic beliefs with political and industrial interests has created a deadlock that potentially threatens population health, but will hold for the foreseeable future.
Hope that wasn't too paranoid.
> Evidence is now trickling in that veganism is associated with all kinds of disease
The two papers you linked seem to me extremely weak evidence for this. One was about fractures, and I would guess is explained by vitamin d (maybe take some supplement…). The other is about mental health, showing raised depression scores and lowered anxiety scores amongst vegans. I’m sceptical; why lower anxiety? Also, no apparent attempt to separate correlation from causation (though I’ve only read the first page, but this seems like a crucial point).
I am mystified as to the anxiety scores, but this is likely just decreased agitation from the lethargy usually experienced on vegan diets.
Correlation vs causation - this is correct, but unfortunately all we have in nutrition is low-quality non-evidence from correlative studies.
I am more than willing, however, to assume that there is a real detriment to vegan diets, since this has been observed in clinical practice for at least a decade: predominantly young women with all sorts of psychiatric and functional disease and nothing wrong except a vegan diet - usually this quickly reverses upon a few weeks of normal meals. The few people I personally have seen trying vegan diets found them intolerable (rather quickly, the only source of nutrition becomes starch and sugar and processed vegetable oils). I realise this is inadequate for a blog focusing on rationalist ideas, but when there is no real evidence to go around, this is what I am left with.
It would be interesting to see an interventional trial, but as I mentioned, nutrition as a research field is not bothering with those anymore (probably because of a lack of success in proving the fat-heart-disease connection - see the Minnesota coronary experiment and the Sydney diet heart study).
The bone thing is complicated: most strength in our bones comes from collagen, which deteriorates when inadequate amounts of animal protein are ingested. The calcium appears to contribute only a minor amount (hence the wide range of bone mineralisation compatible with life). Our body is rather wasteful with bone calcium, too: in a bind, bones are readily demineralised to raise the blood calcium level, and this takes weeks to reverse. I largely think the expedience of measuring bone calcium by X-ray transmittance (DEXA) has unduly focused research on bone mineralisation instead of bone collagen structure, which appears to be more important in preventing fractures anyway, since it is the only thing in bones resistant to shear stress (apatite mostly just resists compression loading). There is no real way of imaging or otherwise analysing bone collagen short of biopsy and staining with tedious image quantification, so it never became of clinical interest.
Edit/addition: in the elderly, who already have demineralised bones, D3 is widely prescribed along with calcium gluconate or calcium carbonate to strengthen bones. As far as I can tell, it is rather useless at that (no one really recovers from osteoporosis; maybe the decline in bone density is slowed a little bit, but the downhill momentum isn't appreciably changed). To me this suggests that (similarly to sarcopenia, which is not caused by a lack of exercise but by a lack of dietary protein) osteoporosis is largely the consequence of protein malnutrition.
> Correlation vs causation - this is correct, but unfortunately all we have in nutrition is low-quality non-evidence from correlative studies.
this is not true? there are many rcts on veganism and vegetarianism and cardiovascular disease (veganism seems better than the typical western diet, maybe on par with most other diets), weight reduction (vegan diets seem better than typical western diets) and bone health (vegan diets carry some bone health risks, but this is due to vegans more often lacking calcium or vitamin d, and these can be supplemented) among other things.
i summarise many of these studies in this post: https://www.erichgrunewald.com/posts/can-a-vegan-diet-be-healthy-a-literature-review/
here are some meta-analyses of rcts:
- Yokoyama et al., "Vegetarian Diets and Blood Pressure"
- Lopez et al., "The Effect of Vegan Diets on Blood Pressure in Adults: A Meta-Analysis of Randomized Controlled Trials"
- Lee et al., "Effects of Vegetarian Diets on Blood Pressure Lowering: A Systematic Review with Meta-Analysis and Trial Sequential Analysis"
- Viguiliouk et al., "Effect of vegetarian dietary patterns on cardiometabolic risk factors in diabetes: A systematic review and meta-analysis of randomized controlled trials"
- Huang et al., "Vegetarian Diets and Weight Reduction: a Meta-Analysis of Randomized Controlled Trials"
> there is an anti-meat campaign (supported by all kinds of special interests) on the way
well, sure, though it's certainly less of a stretch to just chalk it up to legitimately held morals rather than special interests... I would expect Big Green Bean to lose to Big Animal Husbandry in a corporate proxy war
> and an anti-sunlight campaign (supported by mostly the same special interests) as well
huh? an anti-sunlight campaign? As Scott said, you can't just throw something like that out without explaining yourself
I found it pretty easy to grasp. If you have kids your pediatrician will warn you against sunlight as if they were little vampires, telling you terrible things will happen if you don't slather their delicate little skins with SPF 50 and put on a hat and sunglasses every time you go outside. It can be pretty strong stuff. Maybe they've backed off on this a little since my kids were young, though, which was admittedly 25 years ago or so.
Hmm, thanks for the explanation. Though as a 24 year-old I can't say I ever heard my pediatrician dwell on that. Obviously if you're a pale Irish heritage laddie like me and go to the beach and don't put on sunscreen, you're in for some brutal sunburn, higher risk of skin cancer, and leathery skin once you're 60.
I am always skeptical of phrases like "higher risk", but given that 4 of my parents and grandparents have had to get Mohs procedures done, and my grandparents more invasive surgery, we don't seem to be talking about "doubling your chance of getting hit by lightning" here.
Seems to me like the society I know at least is pretty well calibrated against sunlight exposure risk, but it's good to hear one testimonial to the contrary.
It may be the pendulum has swung back. When I was a kid, in the late Pleistocene, it was considered quite healthy for kids to get plenty of sunshine. A deep suntan was kind of expected in the summer months, and all of us got a sunburn or two in the late spring/early summer, each year, if we weren't sufficiently careful, which we often weren't. Our mothers shrugged with general indifference, figuring the discomfort of itchy peeling skin was enough to teach us to be moderate. But nobody really thought of sunscreen unless you were going to the beach or Florida and expected to be out *all day*. For just mowing the lawn or playing a game of ball? Nah.
When my kids were born, it had swung hard the other direction, and we were advised to avoid suntans, avoid the sun in general, until they were at least in double digit years, and allowing a sunburn would be like mild child abuse. It was a rather surprising shift from what I'd experienced as a kid -- but then, you always figure as a new parent that *your* generation is going to be The Best Parents Ever so I didn't really question the received wisdom.
My impression is that people are more moderate now. People who are unusually sensitive (such as yourself) are advised to take due care, and I don't think anyone advises a sunburn, but even my dermatologist thinks it's a good goal for kids to get plenty of sunshine in the summer, enough to form a moderate tan.
Legitimately held morals seem not to apply when judging agriculture, which relies on the large-scale destruction of habitat and the extermination (by means of ploughing, pesticides and harvesting) of thousands of deer, rabbits, mice, rats and all sorts of birds and millions, if not billions of insects per square mile of farmed land.
I understand why people care less about insects than other animals, but hopefully no one tries to argue that the life of a cow shot dead for the steak is worth less than the life of a stag shredded in a combine.
Likely, the most ethical way of obtaining nutritionally adequate food (in the sense of killing the fewest animals per calorie) is hunting game and herding cattle, sheep and goats, provided their fat is used and not burned, as it is currently. However, since this intersects with the anti-fat bandwagon of 1960s academic medicine, there is a nice impasse reached again...
You seem to be implicitly making the argument that plant agriculture is as deadly/harmful to animals as animal agriculture. This is something that people really, really want to be true, but definitely isn't - if nothing else, because more crop production goes to feeding livestock than to feeding humans directly.
I’d be interested in your views on this Medcram video. The argument it puts forward is that people with high vitamin D do better against Covid, but that vitamin D doesn’t protect against Covid. That’s because the thing that actually does protect against Covid is produced by the thing that produces vitamin D. That thing being sunlight. I think his hypothesis is melatonin, but the takeaway for me was to make sure to get outside as often as you can, which I already do! https://youtu.be/9eEyWlbToI4
Going a bit further, the potential chain of confounders is almost endless. People who get outside more tend to get more exercise, be less overweight, eat healthier diets, have better access to medical care, a more supportive group of friends and family, and anything else you might come up with.
True. But he does have a hypothesis for why melatonin might be the answer. (At least I think it was melatonin - it’s a while since I watched.) Something to do with ACE2 receptors or something equally scientific!
Whenever I read anything like this, I just come away thinking it's a miracle we know anything. Apparently medicine is real hard.
It is a 'miracle'! (Everything's a miracle.)
Even worse maybe are things like vitamin C – messing-up how that works actually killed people (in a noticeable way anyways).
There’s a bracket that shouldn’t be there at the end of the unit conversion paragraph I think
I think a better way of expressing this idea is "circular locating of the hypothesis".
There are countless chemicals and microelements that could, in theory, be supplemented with good results. The difficulty is primarily in focusing on the right ones - the actual checking provides fewer bits of information than the focusing does.
For various reasons we focused on Vitamin D. Probably because it really helps in a minority of cases, and also because we had a plausible story - less light exposure due to staying indoors and dressed. But once we focused on it, the checking is done and we didn't find anything.
What we're doing now is looking at the vast space of possible chemicals and saying "hmm, that Vitamin D looks interesting, look at all that fuss around it. I wonder if it's good? Let's check". That's a feedback loop - the more you check, the more "interesting" it looks, regardless of actual results. The only way to break the loop is to look at the evidence, accept (or not, as the case may be) that it's been studied enough, and move on.
Having lived in the US and Europe, one thing I noticed was how much less sunlight you get in Europe, at least seemingly. A huge fraction of the US population have ancestors who lived much further north. My ancestors came from the British Isles, Germany, and Scandinavia (AFAIK). Scotland is much further north than Indianapolis, where I grew up; Indianapolis is level with Rome. It's quite plausible that, even while being inside for more hours, a person whose ancestors lived in Northern Europe would be exposed to roughly as much sunlight in a place like Indianapolis. Then I moved from California to Moscow, Russia, so while there I supplemented with vitamin D for six months in winter. As an academic, at some point I went searching for high-quality RCTs which showed vitamin D supplements boosted immunity. I probably overlooked a bunch of studies, but at the time I couldn't really find anything too impressive. Nevertheless, because of the online hype, and the total lack of sunlight in the northern European winter, I'd supplement with vitamin D daily. In summer I try to get outside for at least 30 minutes a day and be active.
I can see the attraction of vitamin D pills. Who doesn't want to believe in a magic pill that makes you healthier at almost no cost? To my reading, that "pill" is to get exercise, eat a balanced diet with vegetables, fruits, seafood and meat, and don't do things that interfere with your sleep schedule. Go light on alcohol and don't smoke. Do some kind of resistance training.
Maybe I'm wrong here though, and I should be supplementing with something.
Another question I had: is it obvious that taking a supplement pill would be the same as getting more sunlight? I don't understand the physiology at all. But it seems plausible to me that getting vitamin d from natural sunlight might do more than taking a vitamin d pill.
A related, but slightly different point is that it's healthier to be outside than in a tight, confined space, potentially inhaling viruses from other people. And it's also healthier to be active. Vitamin D can be a proxy for both of these things.
The other thing to talk about here is skin cancer. In Australia, which has a huge population of people whose ancestors lived in Britain, skin cancer is a huge problem.
One worry I have with vitamin D supplementation is... if your body gets used to you taking a supplement, maybe your body will react by synthesizing less of it? So, if you start the supplements, you'll have to take them forever, or have some negative effect when you stop?
The active hormone form of Vitamin D is two reactions down from D3. Your body cares about the concentration of C; it produces C via A --> B and B --> C, with any excess of A dealt with by the kidneys like anything else. The diet/exposure/supplement goal is making sure you have sufficient A to feed the two reactions. You probably aren't affecting the regulation of the metabolite reactions by getting more A from a supplement.
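A toy numerical sketch of that argument (made-up rate constants, not a physiological model): if the B --> C step is feedback-throttled so that C sits near a setpoint, then increasing the supply of A mostly enlarges the upstream pools rather than raising the active hormone.

```python
# Toy sketch only (invented rate constants, not physiology): supplementing is modeled as
# a constant supply of A; A converts to B, B converts to the active hormone C, and the
# B -> C step is throttled as C approaches its setpoint, mimicking the regulation above.
def simulate(a_supply, steps=20_000, dt=0.01, setpoint=1.0):
    a = b = c = 0.0
    for _ in range(steps):
        a_to_b = 0.5 * a                                   # unregulated conversion
        b_to_c = 0.5 * b * max(0.0, 1.0 - c / setpoint)    # feedback-throttled activation
        a += dt * (a_supply - a_to_b - 0.2 * a)            # 0.2*a: excess A cleared
        b += dt * (a_to_b - b_to_c - 0.2 * b)              # 0.2*b: B cleared / stored
        c += dt * (b_to_c - 0.3 * c)                       # 0.3*c: active hormone cleared
    return a, b, c

for supply in (1.0, 3.0):
    a, b, c = simulate(supply)
    print(f"supply {supply}: A={a:.2f}  B={b:.2f}  C={c:.2f}")
# Tripling the supply roughly triples A and roughly quadruples B, but C only creeps up
# toward its setpoint - the intuition for why more precursor isn't the same as more
# active hormone.
```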
I think a useful comparison here is the Inuit. They live further north than almost anyone in Europe, and have noticeably darker skin.
The story I heard, somewhere on the Internet, was that the main sources of vitamin D for humans are sunlight and meat, and lowered melanin (lighter skin, and maybe some hair effects) evolves when there's a population in higher latitudes that almost entirely eats plants. Historically, this pretty much meant settled peasant farmers, the main examples being Europe (especially around the Baltic Sea) and parts of China. Populations in higher latitudes that eat a lot of meat, don't *need* to evolve lower melanin levels, but there's no evolutionary pressure to have high levels either, so it's subject to drift.
But I have no idea how accurate that is.
Not just the Inuit. The Mongols. They lived at similar latitudes to Russians or Germans. Only they considered semi-starved agricultural laborers in China a crime against humanity and put their fields to pasture.
In 1972 my physical anthropology prof flatly stated that darker skin was selected for as protection against skin cancer and lighter skin as protection against rickets—and that this explains the relationship between skin color and distance from the equator—particularly in cloudy Europe. I don’t see rickets mentioned.
Maybe I'm a dopey layman, but didn't hunter-gatherers in temperate climates wear shirts? I'll go so far as to suggest that they didn't wear Abercrombie & Fitch [citation needed], but the things that live out in the sun, like flies, brambles, and polecats, can all mess up your skin and introduce disease. That's not a modern realization.
100k IU/d for a week completely resolves my colitis episodes, from 30 stools with nothing but blood to no symptoms, so make of that what you will.
You say:
"Surprisingly, average American levels seem about as high. The nationwide average is about 27 mg/nl, but black people (whose dark skin blocks sunlight) are almost all insufficient and bring down the average; for whites, it’s about 30 ng/ml. Why are these levels as high as some of the farmer-specific studies elsewhere? Maybe it’s Americans’ better nutrition - or maybe it’s that lots of Americans already take Vitamin D supplements. Canadians are close behind at 26 ng/ml; they fail to break this down by season but I’m guessing it was in the summer."
Because all of our milk is Vitamin D fortified, I strongly suspect it's the milk keeping most Americans relatively high. On top of the reduced effect of sun, blacks are much more likely to be lactose intolerant. It would be interesting to see numbers separated by latitude and milk consumption.
I do wonder if the effects of vitamin D are secondary effects. If you have issues caused by D deficiency, fixing that allows your body to repurpose other resources towards other issues. Kind of like the anti-parasite drug that seemed to help against covid... in areas which had a higher prevalence of parasites.
Yup. Vitamin D fortification in the US is a good thing, and also does a poor job of reaching black people.
Personally I take 10,000 IUs, and it seems to help both me, and everybody I have recommended it to.
However - I, and everybody I have recommended it to, also happen to be lactose intolerant office workers who get very little sunlight. Social bubbles, I guess? Dunno why lactose intolerance is so insanely common in my social bubbles, though.
Additionally, the observation pattern is not "This helps me", it's "It's not the vitamin D, I'm just having a better week than normal. Oh. It happened again. Oh god. Oh god I'm a bag of chemicals."
My guess would be that 10,000 IUs is too low to be directly harmful, while also being sufficiently high to quickly resolve long-term severe deficiencies in a timeframe short enough to be personally observable.
Given social bubbles, I don't think "General population who is at normal or slightly insufficient levels" is the correct reference class. Nor should vitamin D be considered a "normal" vitamin when considering deficiencies; the cluster of people who benefit from it are going to benefit from a significantly higher dosage than you'd expect looking at the average population.
If you're lactose intolerant, then that's a dietary source you are not able to utilise for Vitamin D. Supplementing is therefore useful in that case.
Weird. I don't think I know anyone who is lactose intolerant (or at least know them well enough for it to have been mentioned to me).
Most white people are lactose tolerant. Most non-white people aren't.
I think this is misleading. Most children aren't lactose intolerant, by and large; they become slightly lactose intolerant as adults because they stop ingesting dairy regularly. Adults who continued regular ingestion of dairy mostly don't have issues.
Similar patterns happen with other adaptations too. If you go on a keto diet for a long time and then start eating carbs again, you'll have serious bloating and gastrointestinal issues for a while too.
We've identified the location of the gene that is responsible, and found mutations associated with lactase persistence. It is genetic, and primarily so; some people do maintain the ability to digest lactose without the mutation, but most do not.
What race are you?
Lactose intolerance is uncommon amongst white people, but in every other racial group it is the norm. If you work with a lot of Asians, black people, or Native Americans, you're likely to know a lot of lactose intolerant people.
Glad Scott mentioned obesity, because a lot of the discussion of Vitamin D I've seen omits what seem to be important differences in vitamin absorption in obese people. It's a harder topic to talk about due to PC, but IMO it should be noted whenever statistics about what percentage of Americans are Vitamin D deficient are thrown around. People who are at a healthy weight are at a lower risk of deficiency than the general population and should supplement less. (Although de facto, just as vaccinated people are more likely to wear masks, I expect that non-obese people are more likely to take Vitamin D.)
(Also—if you're supplementing D because you get outside very rarely, consider changing that?)
Having seen crippling adult asthma mostly cured through high dose Vitamin D, I will persist in believing that there is something besides calcium regulation going on.
As for what constitutes Paleo Vitamin D levels, look at some National Geographic magazines from a century ago. Many warm weather people wear considerably more clothes these days. Conversely, Americans wear considerably less than they did a century ago.
wow: as a biologist (me), instead of focusing on evolutionary "theory" and Vitamin D dosing, it would be more appropriate to look up the modern data which is available on our (humans') ability to metabolize and use Vitamin D on a daily basis. Vitamin K is in its own category because of its relationship to blood coagulation, so that should be medically evaluated. Vitamin C has many studies, along with zinc, which show that it does boost immune responses. People think that vitamin dosing is a benign intervention, but I have personally known people who had some bipolar issues and other symptoms who overdosed on vitamins, giving them diarrhea, cramps, etc. A lot of clinical data on modern daily doses of all vitamins is available. Still a good article. :) lots of work too. more than what I would do :) have a great day!
Charts like that make me despair for medical science.
Chemicals on the giant chart of metabolic pathways are kind of like religions. For any chemical with a large enough biological role, there is going to be some group of people exhibiting cult-like behavior, who are absolutely convinced that it is the One True Chemical that is the ultimate solution to all disease.
Frustratingly, some of these groups are probably correct, in the same way that on the day the world goes up in flames in a nuclear apocalypse, there is bound to be some conspiracy theorist somewhere in the world who predicted it correctly.
Hehe, I feel called out, given I say I belong to the Church of Cobalamin. :) (That's vitamin B12.)
I do at least try to curb my enthusiasm, and/or plaster it with disclaimers that This One Weird Trick may not work for you at all. But there's something nice about being able to recommend people just give it a shot because it's over-the-counter and it's one of the few vitamins where overdosing isn't really an issue. "Are you having trouble sleeping? Inexplicably sensitive to sound and light? Depressed? Troubles thinking straight if someone talks nearby? Try vitamin B12 t o d a y!" It's had a ~20% success rate so far, which is good enough that I intend to keep recommending it (if the symptoms I expect to see match enough). But it is *definitely* my hammer where lots of things suddenly look like nails! ;) Guilty as charged.
(I don't think it's gonna cure COVID-19, though, just to be clear, nor any other pathogen-caused disease.)
I'm sure I spend less time outside than my father (who grew up on a farm), and he was outside less than his father (who lived on a farm his entire life), and he in turn less than every ancestor before him (who worked farms using only horse power) - in times before sunblock or sunglasses were invented.
Does this mean my vitamin D levels are lower than any ancestor? Is this why I'm so near-sighted?
"Hunter-gatherers in the environment where most of our evolution happened might have been outside all day shirtless. On average the sun's halfway from peak, so that might be equivalent to 8 hours of peak sunlight at the equator."
Okay, from the start I am going to dispute this. We've been told hunter-gatherers had this idyllic ancestral lifestyle where they got all their needs met in a few hours and had the rest of the day for leisure time. They weren't forced to be out toiling in the fields under the blazing sun for hours every day.
If they're walking around shirtless under the equatorial sun for eight hours a day, then they are stupider than lions:
"Lions are most active during dawn, dusk and periodically throughout the night. During daylight hours they can be found lounging or sleeping in the shade."
If our shirtless hunter-gatherer can't figure out "get under a bush, stay in the shade, don't move around too much, and drink water", then he has more problems going on than "boy, I must be generating *so* much Vitamin D right now!" can solve.
After all it's just mad dogs and Englishmen who go out in the mid-day sun:
https://www.youtube.com/watch?v=BifLPGi4X6A
Me and everyone in my family are vitamin D deficient in the winter (dark skin + living in the UK), and the main complaint we had that got fixed by taking vitamin D was extreme fatigue. We went to the doctor about tiredness, got blood tests, and then got given the 20,000 IU or 50,000 IU pills. At no point did anything about bones come up when discussing it with the doctor, for me at least (can't speak for everyone else).
So I'm surprised that you didn't even MENTION it.
So many paragraphs on D without hitting the most important points. You hold up the IOM (now renamed the NAM) as more prestigious than the Endocrine Society despite their statistical mistake and the fact that the Endocrine society has 18,000 members (in "medicine, molecular and cellular biology, biochemistry, physiology, genetics, immunology, education, industry, and allied health" according to Wikipedia). Okay. The IOM's own minimum serum level recommendation for 25(OH)D is 20ng/ml. This is also the target min set by the European Food Safety Authority, Germany, Austria, Switzerland, Nordic Countries, Australia, & New Zealand. It was also the consensus rec of 11 international orgs (see https://academic.oup.com/jcem/article/101/2/394/2810292?login=false). Despite this consensus roughly 50% of humans globally do not achieve this level. Calculations & citations for the 3-4 papers involved can be found at: https://twitter.com/KarlPfleger/status/1390775110257102848 (I welcome corrections). Why don't you see this as a big problem? Even if only for global bone health. For the Endocrine Society's recommended 30ng/ml minimum, roughly 3/4 of people globally are too low. Note also that this higher 30ng/ml is the level at which typical blood tests from Quest or LabCorp in the US are flagged as low. What governments are making concerted efforts to drive down these deficiency rates? 25(OH)D tests are inexpensive & widely available. The normal standard is that RDAs are set so that 2.5% or fewer people are deficient. The US estimates typically come in at 25-35% or sometimes 40% at the 20ng/ml level. That's 10x the deficiency prevalence max target. What US government agency is responsible for reducing these deficiency numbers?
And while those deficiency %s were for populations as a whole, looking at racial breakdowns also compels action. As noted by Ames, Grant, & Willett in https://www.mdpi.com/2072-6643/13/2/499 dark skinned minorities have it much worse. In the US 75% of blacks are clinically deficient & 96% are below the 30ng/ml level recommended by the Endocrine Soc. It's not absolutely proven that this explains part of their worse health outcomes but it is clearly the conclusion of these authors (who are respectively pillars of nutrition research, vitamin D research, and clinical medical research) that it is an important factor, and certainly not proven that it isn't. This is clearly a racial issue. These are absurdly unacceptable deficiency prevalences. Why aren't public health groups & government agencies trying harder to help these groups?
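For intuition about how those prevalence figures relate to the average serum levels quoted elsewhere in this thread, here's a rough sketch under a deliberately simplified assumption (a normal distribution with an invented spread; neither number comes from the cited papers):

```python
# Rough illustration only: assume serum 25(OH)D is normally distributed with the ~27 ng/ml
# US average quoted elsewhere in the thread and an invented spread, then ask what share
# of people falls below the 20 and 30 ng/ml thresholds discussed above.
from statistics import NormalDist

mean_ng_ml = 27.0      # US average level quoted in the post
sd_ng_ml = 10.0        # assumed spread, chosen only for illustration

dist = NormalDist(mean_ng_ml, sd_ng_ml)
for threshold_ng_ml in (20.0, 30.0):
    print(f"share below {threshold_ng_ml:.0f} ng/ml: {dist.cdf(threshold_ng_ml):.0%}")
# ~24% below 20 ng/ml and ~62% below 30 ng/ml under these assumptions; the global figures
# quoted above (~50% and ~75%) would imply a lower mean and/or a wider spread worldwide.
```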
You both downplay the importance of the observational studies too much. Much of science and many important clinical medical decisions are & should be based on observed correlations together with understanding of the basic underlying science (physics/chemistry/biology). Astronomy and other scientific fields make progress without the use of any RCTs. Seatbelts, parachutes, and smoking warnings on cigarettes are all justified without recourse to RCTs.
But now let's talk about Covid risk factors. Age, obesity/overweight, and comorbidities like diabetes are established Covid risk factors that no one questions. All of these are based exclusively on observational data. No RCTs are part of establishing these as legitimate risk factors. Vitamin D status, i.e. 25(OH)D, is also unquestionably at this point a statistically significant risk factor for Covid: 75+ studies with ~2M aggregate subjects, multiple meta-analyses, narrow confidence intervals, and an effect size of 1.5-2x difference in risk, mostly based on the 30 ng/ml threshold or the 20 ng/ml threshold. (See https://twitter.com/KarlPfleger/status/1486565564671692804 for citations.) This is not just important for hypothesis generation about therapeutic potential; it's also important for stratifying the absolute risk level for groups or individuals, which is how one clinically computes the Number Needed to Treat (NNT) to help decide on the relative benefit vs risk of potential interventions, such as vaccination or use of anti-viral drugs.
Meantime, no evidence suggests that being vitamin D deficient is protective for Covid. And known immune biology suggests multiple clear mechanisms of action by which D should be protective. (See eg https://asbmr.onlinelibrary.wiley.com/doi/full/10.1002/jbm4.10405 but more mechanism papers in the previously linked Twitter thread). So, that leaves a pretty clear-cut benefit vs risk analysis:
If governments / public health officials emphasize reducing deficiency for pandemic control & end up being wrong about D helping Covid, the biggest side-effect would be reduction in the huge prevalence of deficiency (see other comment I just made here), w/ consequent improvements in population wide bone health, and probably autoimmune health, & resistance to other ARIs.
Or conversely, if officials recommend a concerted effort to reduce deficiency for the non-Covid benefits, reduced Covid transmission, hospital burden, & deaths are all plausible side effects even if not guaranteed. But a worse pandemic is not. I find your overall stance puzzling given the apparent imbalance when viewing things this way.
Lastly, (this is the 3rd of 3 top level comments I'm making here) let's talk about scientific hypotheses and the proper ways to test them. The massive observational data showing correlation between 25(OH)D and increased Covid risk suggests more than anything else a specific form of clinical intervention: Give deficient people vitamin D until they have enough (eg for example those starting at <20ng/ml until they have >= 30ng/ml). This is in fact exactly the form of intervention long advocated by vitamin D researchers such as Heaney: https://academic.oup.com/nutritionreviews/article/72/1/48/1933554 and Grant et al: https://www.sciencedirect.com/science/article/abs/pii/S0960076017302236?via%3Dihub
The intervention of giving a fixed amount of D to people whose baseline D levels haven't been tested, or of giving a fixed amount after testing their levels but without confirming that they have achieved sufficient levels before beginning to record adverse event differences vs control, are not correct forms of study design for properly testing the main hypothesis. And in fact these problems are the main reasons why many vitamin D RCTs fail. See for example https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7487184/
Given the massive observational data, the many clear mechanisms of action, and the fact that no pharma company will fund a study of an inexpensive supplement, it is puzzling that governments have not adequately funded a properly designed study to test the hypothesis properly. To this day there is no RCT testing the intervention of raising 25(OH)D from 20 to 30 ng/ml that is powered to adequately rule out the hypothesis that it would reduce severe Covid outcomes like death & ICU admission, yet people (including this piece) keep (inappropriately in my opinion) casting doubt on that hypothesis despite compelling biology.
The observational data is solid. It doesn't prove causality, but it's suggestive of such a strong potential benefit that it should be someone's responsibility to disprove the causality by finding the confounding shared causal variable that fully explains the correlation. Until then it's reasonable to apply the precautionary principle and try to ensure people are not deficient.
I remember people making similar arguments -- can't hurt! might help! precautionary principle -- about Vitamin E about 25 years ago.
Turns out, when the careful work was done[1], Vitamin E supplementation not only doesn't help in one of its target disorders (heart disease), it actually boosts the risk of heart failure. So significant supplementation does active and measurable harm. Oopsie!
-----
[1] Popular report: https://www.webmd.com/food-recipes/news/20050315/vitamin-e-harms-more-than-helps
This is a false analogy, because the hypothesis about vitamin E back then was primarily that it just helped regardless of status, not that a huge proportion of society was deficient. The precautionary principle is much more appropriate to use in today's vitamin D case precisely because there is ~50% global deficiency (at the government-set recommended level) and ~75% insufficiency at the Endocrine Society's recommended minimum level, and the relevant hypothesis suggested by the massive observational data is that fixing the too-low levels is what will provide benefit. We know that giving more and more of most things doesn't continue to provide increasing benefit forever, so it was not entirely reasonable to say that more E regardless of baseline was an ideal use of the precautionary principle. It's much harder to imagine that fixing too-low D levels will cause harm than that blindly giving more E will. There's no known benefit to being vitamin D deficient, especially for Covid. So the precautionary principle is much more justified in this case.
I remember reading an article back in the 1990s or early 2000s in Natural History (AMNH's member magazine) about vitamin D and folic acid. It pointed out that increased sunlight raises vitamin D levels but lowers folic acid levels, so there is a tradeoff between getting enough of each. This suggests that there is probably a limit on how much D a person can acquire from sun exposure without running into a folic acid shortage.
(The authors also argued that this accounts for women having slightly lighter skin than men as a result of their having a different balance point for the two chemicals. Is this even true? I have no idea. I can think of dozens of confounding factors, but that's a discussion for another time and place.)
Is it that hard to believe that Vitamin D is a boring bone and immune system chemical, as opposed to just a boring bone chemical?
The idea that it can "treat COVID, prevent cancer, prevent cardiovascular disease, and lower all-cause mortality" sounds ridiculous when described like that, largely because it implies that those are all separate things that Vitamin D just happens to fix. But really, it's all one simple (and thus not especially unlikely) effect: It improves the immune system's functioning, and anything that improves the immune system's functioning is going to make the body more capable of dealing with illnesses in general, including COVID, cancer, and probably some forms of heart disease (specifically, whichever types are caused by pathogens). Obviously it's not a miracle cure for any of those things - it's not guaranteed to prevent you from getting them, nor to make them go away once you have them. But it can make your body a little better at dealing with them.
"Improves the immune system" is just hiding the magic underneath another anodyne phrase. What do you mean, specifically, by "improve the immune system?" There are plenty of drugs one can imagine that interfere with this signalling pathway, or cofactors required for this particular cascade, et cetera -- but these are all highly specific causes and results. It would be unusual to find one compound that has such broad activity, across the hundreds to thousands of pathways involved in immune response, such that one can only really describe it by saying it improves the immune system.
That, I think, is the point. If one small molecule can have such very broad positive results, it follows, looking at it from the other direction, that this one molecule is a single point of failure for an entire critical physiological system: not enough Vitamin D and *everything* goes to hell. That is not in general how physiology works (nor would it be a good engineering design for a robust system). Usually there are backups and secondary pathways and other ways of getting stuff done for almost any major physiological system. We can even do without O2 for very short periods.
That's not to say Vitamin D necessarily *doesn't* play as central and critical a role as would be required for it to have these very broad effects, but it's a proposition that one would quite reasonably view with skepticism, on the grounds that it seems dubious Mother Nature (or God the Creator if one prefers) would build a system with such inherent design fragility.
Granted, I'm not an immunologist, and I don't claim to understand the mechanisms through which a chemical might conceivably "improve" the immune system, or even the mechanisms through which the immune system works in the first place. For that matter, I don't really understand what mechanisms would cause a "boring bone chemical" to improve bone quality, or how a substance might conceivably improve circulatory health or digestive functioning or neurochemical balance or any number of other things that various substances are capable of doing. Is there any reason to think that it's *less* likely for a substance to be able to affect immune response than any other particular "system" in the body?
Not at all. I'm just saying (1) almost no normal nutrient has any significant effect unless you're suffering from serious deficiency, on account of the body normally having about eight different pathways for doing the same thing as backups against mild deficiency -- and, I mean, thank God, that's why we can take statins without killing ourselves, because although we interfere with one major pathway for building steroids (the one that leads to cholesterol), there are backups and workarounds to build those we can't possibly do without; and (2) if you are looking for *positive* effects (improvements above normal-baseline-healthy), you would expect those to be rather specific, as we expect from most therapeutic drugs, e.g. you wouldn't expect Celexa to improve your kidney and immune function as well as your mood, and you wouldn't expect your BP meds to make you better able to focus and more resistant to winter flu.
But statins do end up having a major effect, don't they? So these natural redundancies don't fully compensate for the pathway statins block. Which kind of refutes the objection, doesn't it?
> Our priors on a random chemical doing that have to be less than 1%, or we get caught in weird logical traps.
Maybe not all of those at once, but what are the chances that a chemical does any one of those? If it's less than 5%, then that means that, given that some random chemical passes a well-done randomized controlled trial, it probably doesn't work.
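A quick worked version of that base-rate argument, where the 80% power and 5% false-positive rate are assumptions I'm adding purely for illustration:

```python
# Posterior probability that a chemical really works, given that it passed
# one well-run RCT. Prior, power, and alpha below are illustrative assumptions.
prior = 0.05    # P(random chemical truly has the effect)
power = 0.80    # P(trial passes | it works)
alpha = 0.05    # P(trial passes | it doesn't work), i.e. false-positive rate

p_pass = prior * power + (1 - prior) * alpha
posterior = prior * power / p_pass
print(f"P(works | passed trial) = {posterior:.2f}")   # ~0.46, i.e. still more likely not
```

So with a 5% prior, even a clean positive trial leaves the odds slightly against the chemical actually working, which is the trap the quoted line is pointing at.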
On the importance of vitamin D: we dropped our skin pigmentation on our way out of Africa in evolutionary record time; whatever the reason was, it apparently changed either mortality or reproductive success dramatically. And it's not unreasonable to assume that this is vitamin D, or that vitamin D plays an important role in it. Not sure how this can be proven though, it's just a hypothesis.
Regarding the minimal/optimal levels: I know the Hadza studies, but there are many more. I remember a few from India and Italy that looked at people working outside with little clothing, like farmers and construction workers, and they all came out at levels between 40 and 60 ng/ml as the "normal/optimal" range. But of course there are plenty of studies pointing in the other direction and considering 40 ng/ml way too high; the official recommendation here in Germany was for ages "nobody needs any supplements, including vitamin D, you get all you need from a balanced diet!". The exception to the rule: babies in their first year get 400 IU here because this dramatically lowers the chances of rickets.
And this is the point which made me rethink my position a few years ago: the body of a pregnant or nursing mother makes sure the embryo/baby gets everything it needs, depleting the mother's own stores if necessary. Except for vitamin D, which is often minimal in human milk. Unless, and this has been tested clinically, the mother's 25(OH)D reaches around 40 ng/ml or above; then the milk contains enough of the vitamin and no supplementation is necessary.
There were several studies looking at this, small/medium RCTs if I remember correctly, with supplementation in the range of 4000-6500 IU daily, and besides the milk issue they found effects like a 20% lower probability of birth complications and other improvements. I really should dig out these studies again; I haven't bookmarked them.
The only one that made it into my "google keep" is this one: "New insights into the vitamin D requirements during pregnancy" https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5573964 which is an overview of the state of things during pregnancy, with over 140 references
PS: Bear with me, English is not my first language ;-)
Adding more anecdotal evidence to the pile: I have fibromyalgia and have been dealing with flareups for approximately my whole life, without knowing what they were (I wasn't certain I was chronically ill until 2020). My worst episode was in mid-2019, where for nearly three months I was so fatigued that there was a noticeable drop in my work quality, and I had to start working from home because I couldn't make the ten minute walk to the train station. I was essentially bed-ridden for part of the time, and I had at least one day where I couldn't reliably speak or focus my eyes.
Doctors did a full blood panel, and the only unusual thing they found was that my vitamin D was at 24.8 ng/mL, considered insufficient but not deficient. I started taking 10,000 IU daily, and the months-long episode cleared up within a week or two. I've been taking 5,000 IU daily for the past two years and my rate of flareups — which previously happened every couple months, unpredictably — has dropped to basically zero. I also talked to a fibromyalgia specialist (who was maybe a bit of a quack, but ¯\_(ツ)_/¯) who said she recommends all her patients take 5,000-10,000 IU daily — not that I trust much of anything she said, but it did feel like it backed up the idea that I was not totally crazy to think that supplementing vitamin D was helping.
My current hypothesis, or story, of vitamin D is this in short: vitamin D helps control and adapt our metabolism to summer/winter cycles. Nearly every cell in our bodies has receptors for vitamin D.
The farther a region is from the equator, the less sun and energy reaches the ground during the winter, and in the end the less food is available for animals. Therefore evolution invented hibernation. This is a special 'low activity, low power consumption' mode which helps animals survive hard, long winters without food. In autumn many animals prepare for the coming winter by eating as much as they can and storing a lot of additional energy as fat in their bodies. This fat is then slowly converted into energy during winter, while many of these animals just sleep most of the time. Research on hibernation seems pretty new. Only recently did researchers find hibernation mechanisms in animals that were not known for it, for example Mongolian wild horses, which move far less and reduce their body temperature.
If I look at all the random facts I have learned over the past 20 years, I see a pattern: I have more appetite in dark times, I get fat in autumn and winter, I eat less in summer, I'm depressed when it's darker, I'm happy when it's light, in winter I move far less compared to summer, and vitamin D, SSRIs, serotonin, melatonin, sleep, and mood/motivation (or lack of it) are all related to each other, and so forth.
Not sure where I first heard about the hibernation hypothesis, but I cannot un-learn it anymore.
TL;DR
If I look at all those strange things happening with my body through the 'evolution-hibernation lens', it seems to make a lot more sense to me.
Just remembering random facts I've collected over the years: D3 is animals, D2 is plants. Both need to adapt as well as they can and survive hard, long winters without enough food. Hibernation and low energy consumption, in the life-and-death context, mean shutting down every non-essential function and maintaining the body over the next 5 to 8 months with the lowest possible energy consumption. Bone growth might be a luxury, an overly active immune system might be a luxury, and moving and exploring are definitely luxuries. I would consume the least energy if I slept all the time.
Our son has celiac disease and has blood tests every 6 months. Just for fun, the gastro doctor said to throw in some extra tests last time around including Vitamin D. It came out very low. We were mortified, and pledged to start supplementing straight away. She told us we could supplement if we wanted, because it's practically free after all, but that every single child she had ever tested came out low, often even lower, and it was nothing to really worry about (incidentally, while our son's symptoms are pretty out there, she has seen it all, including paralysis, fits, and psychosis as a result of celiac).
Anyway, our son has a whole suite of symptoms that occur every time he has a 'glutening', caused by putting his fingers in his mouth after touching a crumb on the bus, or the like. You can set your watch by it. This happens roughly every three weeks, and while the episodes are not life-threatening they are sufficiently serious - both physically and mentally - that we have organised our life for years around minimising them.
Two weeks after we started supplementing, a glutening occurred. All the same symptoms, but much milder. A month later, even milder. A month later, even milder. So mild that, if we weren't so attuned to them, we wouldn't even notice them as part of a pattern. After years of fending off doctors and teachers telling us he needed Ritalin - and making faces at each other when we said his mood swings and hyperactivity were a side-effect of exposure to gluten - we found the cure. Our family doctor - who by the standards of the profession is actually pretty good - made a vague effort to feign interest before telling us not to rule out Ritalin.
A general programme of promoting vitamin D supplementation would, at the bare minimum, save tens of thousands of children from dangerous medication and essentially abusive treatment regimens for hyperactivity. All for perhaps 0.00000000000000001% of the costs of Covid policy. Whatever the actual mechanisms involved (and I lean to a Eugyppius-Yarvin type model rather than the Alex Jones version), the medical industry is functionally a criminal entity. Analyses like this one that don't rest on an understanding of the fundamentally criminal nature of the medical industry in selecting and interpreting data are literally worse than useless.
Scott writes of his priors. Mine—however anecdotal—are as follows:
About a decade ago, having moved about a decade previous to that from southern Ontario to southern England, I had a very bad winter health-wise: instead of a handful of colds and perhaps one illness bad enough to raise a fever, I had four or five bouts of illness with fever and in between I would never fully recover, remaining congested with a persistent cough.
In the decade preceding that winter I had tried taking vitamin C supplements for immune health, but had stopped because they did not have any discernible effect. Around this time, though, I read a popular press article written by a prison doctor (probably this one, though the URL is defunct and not archived by archive.org: https://www.medicalnewstoday.com/articles/51913) who had noted that vitamin D supplementation dramatically improved the health of the prisoners in his care. So, living in northern Europe and spending most of my time indoors (and despite being quite fair-skinned), I figured it was worth a try.
For the next two winters I took around 2000 IU of vitamin D3 every day, and had nary a sniffle. Unfortunately, however, it was difficult at the time to get even modestly-high-dose vitamin D tablets in the UK (the RDI here at the time was something laughable like 40 IU/day) and eventually my imported supply ran out, so the following winter I took none and was terribly sickly again—perhaps not as bad as that first terrible winter, but much worse than I had been the two intervening years.
Since then I have managed to keep myself supplied and have resumed enjoying generally good winter health. Perhaps it's just the placebo effect—though a placebo which proved highly effective against COVID too, as it happens—but good enough to be in the "definitely worth it" category for me.
If hydroxymethylbilane supplementation cures anything, it'll be vampirism.
As the Luxwolda 2012 paper shows, the semi-nomadic pastoral Maasai tribe have a mean level of 48 ng/ml, with the largest subset of the group (40%) having a level between 48 and 60 ng/ml, and the third-largest subset (~12%) having even higher levels, between 60 and 70 ng/ml. Even including the hunter-gatherer Hadzabe tribe, the largest subset is still the 48-60 ng/ml group, at 33.3%. So the median (which is a more representative value of the norm than the mean) will be in this range, higher than the 44 ng/ml mean level stated in this blog post, and so higher vitamin D intake will be required to reach the median value.
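For what it's worth, here is a small sketch of that median-versus-mean point. Only the 48-60 ng/ml (40%) and 60-70 ng/ml (~12%) Maasai bins come from the comment above; the other bin weights are placeholders I invented purely to show how a grouped-median interpolation works, so the printed number is not the paper's value:

```python
# Grouped-median interpolation. Only the 48-60 and 60-70 ng/ml shares come
# from the comment above; the other bins are invented placeholders.
bins = [            # (lower edge ng/ml, upper edge ng/ml, share of sample)
    (20, 36, 0.18),   # placeholder
    (36, 48, 0.25),   # placeholder
    (48, 60, 0.40),   # quoted: largest Maasai subset
    (60, 70, 0.12),   # quoted: third-largest Maasai subset
    (70, 90, 0.05),   # placeholder
]

cumulative = 0.0
for lo, hi, share in bins:
    if cumulative + share >= 0.5:                    # the median lies in this bin
        median = lo + (0.5 - cumulative) / share * (hi - lo)
        break
    cumulative += share

print(round(median, 1))  # lands inside 48-60 whenever <50% of the sample sits below 48 ng/ml
```

The general point stands regardless of the placeholder weights: as long as less than half the sample sits below 48 ng/ml, the median falls in the 48-60 ng/ml bin.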
Vitamin D seems to have some immunomodulatory effect that is beneficial for some autoimmune conditions like the ulcerative colitis I've lived with for the past 15 years. Plus some meta-analysis said it reduced colorectal cancer risk by half, and calcium absorption in the colon is plausibly related to that. If it were just "sick people go outside less" there wouldn't be a big signal for colon cancer in particular. On the downside, chronically high but non-toxic levels of vitamin D might accelerate the calcification of the pineal gland, causing melatonin deficiency, causing poor sleep, causing a variety of other health problems. I basically can't sleep well without completely blacking out all sources of light -- streetlights through blinds are way too much light for me.
If the etiology of death by covid is mostly the immune system overreacting and attacking the lungs, and we have other evidence of vitamin D being helpful in autoimmune conditions, our prior for vitamin D benefitting covid patients should not be that low. Dexamethasone reduced covid mortality by half just by blunting the immune system's overreaction against the lungs. Vitamin D is probably not as potent as dexamethasone, but more study is warranted. My prior on any sort of anti-inflammatory or immunomodulatory agent benefitting covid is at least an OOM higher than my prior on dart-chemicals benefitting covid.
Some studies have supported much higher Vitamin D blood concentrations:
83.4 nmol/L: Vitamin D and mortality: Individual participant data meta-analysis of standardized 25-hydroxyvitamin D in 26916 individuals from a European consortium, by Gaksch, et al., PLOS ONE, DOI:10.1371/journal.pone.0170791, February 16, 2017
> 50 nmol/L: Vitamin D status and epigenetic-based mortality risk score: strong independent and joint prediction of all-cause mortality in a population-based cohort study, by Gao et al. Clinical Epigenetics (2018) 10:84 https://doi.org/10.1186/s13148-018-0515-y
~35 nmol/L: Evidence for a U-Shaped Relationship Between Prehospital Vitamin D Status and Mortality: A Cohort Study, by Sadeq et al., J Clin Endocrinol Metab, April 2014, 99(4):1461–1469, doi: 10.1210/jc.2013-3481
77.5 nmol/L: Vitamin D deficiency and mortality risk in the general population: a meta-analysis of prospective cohort studies, by Zittermann, et al., Am J Clin Nutr 2012;95:91–100 doi: 10.3945/ajcn.111.014779
75 nmol/L: Commentary: Additional strong evidence that optimal serum 25- hydroxyvitamin D levels are at least 75 nmol/l, by WB Grant, International Journal of Epidemiology 2011;40:1005–1007
and finally the big kahuna of them all:
110 nmol/L: An estimate of the global reduction in mortality rates through doubling vitamin D levels, by WB Grant, et al., European Journal of Clinical Nutrition (2011) 65, 1016–1026
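Since these mortality studies report thresholds in nmol/L while most of this thread talks in ng/mL, here is the standard conversion for 25(OH)D (1 ng/mL is approximately 2.496 nmol/L), applied to the values listed above:

```python
# Convert 25(OH)D concentrations from nmol/L to ng/mL (1 ng/mL ~= 2.496 nmol/L).
NMOL_PER_NG_ML = 2.496

def nmol_to_ng_ml(nmol_per_l):
    return nmol_per_l / NMOL_PER_NG_ML

for nmol in (35, 50, 75, 77.5, 83.4, 110):       # thresholds from the list above
    print(f"{nmol:>5} nmol/L = {nmol_to_ng_ml(nmol):.0f} ng/mL")
# The "big kahuna" figure of 110 nmol/L comes out to ~44 ng/mL, right around the
# hunter-gatherer mean discussed elsewhere in this thread.
```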
But wait! There's more! Namely, magnesium!
Magnesium, vitamin D status and mortality: results from US National Health and Nutrition Examination Survey (NHANES) 2001 to 2006 and NHANES III, by Deng et al., BMC Medicine 2013 11:187. doi:10.1186/1741-7015-11-187
Finally, on the "boring bone chemical" point, there have been some studies on the relationship between vitamin D levels and depression:
Vitamin D deficiency and depression in adults: systematic review and meta-analysis, by Anglin et al., The British Journal of Psychiatry (2013) 202, 100–107. doi: 10.1192/bjp.bp.111.106666
"One case–control study, ten cross-sectional studies and three cohort studies with a total of 31 424 participants were analysed. Lower vitamin D levels were found in people with depression compared with controls (SMD = 0.60, 95% CI 0.23–0.97) and there was an increased odds ratio of depression for the lowest v. highest vitamin D categories in the cross-sectional studies (OR = 1.31, 95% CI 1.0–1.71). The cohort studies showed a significantly increased hazard ratio of depression for the lowest v. highest vitamin D categories (HR = 2.21, 95% CI 1.40–3.49)."
Vitamin D Supplementation Affects the Beck Depression Inventory, Insulin Resistance, and Biomarkers of Oxidative Stress in Patients with Major Depressive Disorder: A Randomized, Controlled Clinical Trial, by Sepehrmanesh et al., The Journal of Nutrition November 25, 2015; doi:10.3945/jn.115.218883.
" Baseline concentrations of mean serum 25-hydroxyvitamin D were significantly different between the 2 groups (9.2 6 6.0 and 13.6 6 7.9 mg/L in the placebo and control groups, respectively, P = 0.02). After 8 wk of intervention, changes in serum 25-hydroxyvitamin D concentrations were significantly greater in the vitamin D group (+20.4 mg/L) than in the placebo group (20.9 mg/L, P < 0.001). A trend toward a greater decrease in the BDI was observed in the vitamin D group than in the placebo group (28.0 and 23.3, respectively, P = 0.06). Changes in serum insulin (23.6 compared with +2.9 mIU/mL, P = 0.02), estimated homeostasis model assessment of insulin resistance (21.0 compared with +0.6, P = 0.01), estimated homeostasis model assessment of b cell function (213.9 compared with +10.3, P = 0.03), plasma total antioxidant capacity (+63.1 compared with 223.4 mmol/L, P = 0.04), and glutathione (+170 compared with 2213 mmol/L, P = 0.04) in the vitamin D group were significantly different from those in the placebo group"
If I recall correctly, there were pretty robust meta-analyses showing that vitamin D supplementation helped recovery from respiratory infections long before COVID. Not cure-all good, but still meaningfully good.