103 Comments
Comment deleted

One point to the pay issue though, that I don’t see in the comments so far, is the huge growth in administrative positions in all levels of K-12 through university, that sucks money and resources from both teaching and research. If a reasonable amount of that money were available otherwise, the crushing weight might be removed from the teaching/research structure and improve the university experience.


But we simply must have an assistant to the assistant dean for Title IX compliance, or no one will be safe!

One important skill I learned in college is that with enough bothering of admins and filing of paperwork, you can accomplish anything.

I got out of one of the main courses of my major and was allowed to TA it, basically just through persistence (and their own stupid enrollment system).

Comment deleted

I think it's partly can't, and partly won't.

There's a dynamic, associated with up-or-out systems but not identical to them, and similar to Moloch but not the same, where you have a lot of people putting in a lot of work and only a handful getting a big reward. I wouldn't precisely call it "Molochian", because the work can be useful in itself. But it's a gamble: for the ones who get the reward it's worthwhile, but for the ones who don't, it was simply an experience. Success is partly luck and partly skill, and you never know how much was either; but if you've tried a few times and failed, you're probably aware that you're not in the top fraction of a percent of skill. So unless you actually enjoy the experience more than other things you could do with your life (like, say, children), it's not worthwhile.

Unlike academia, I don't think the software world would come apart if all but the top 1% avoided this. But that won't happen as long as there are young people who want to push themselves to find what their limits actually are. Which isn't so bad in itself; I mostly object to non-startups acting like they should expect this behavior.


> Completely exclude research from the main career path and consider imposing outright guild-like rules preventing your hires from even publishing research while they're employed with you, and perhaps for six months afterwards.

Psychology research student here (or, as the lingo goes, "early career researcher", aka "person we can get to do grunt work on the cheap").

Speaking entirely personally, I really wouldn't find this direction appealing. I am working on my PhD because I want to do research. Oh, I enjoy teaching, and would be totally fine with teaching being ~ 50% of what I spend my time doing, but a teaching-only position would be extremely unsatisfying to me, since research is what I'm primarily interested in - expanding human knowledge, learning new things, finding different ways to frame ideas and test theories etc. So being literally banned from doing it would cut the main motivation out, no matter how much I enjoy teaching.

I'm not even especially fussed on "prestigious institution". Oh, I see the benefits - more resources, access to "better" colleagues, etc - but I would rather do research at a less prestigious place than teach at a more prestigious place. Obviously I'm just an N of one, but speaking anecdotally this fits with the people I've worked with. If it helps I was raised solidly middle-class, and faced few serious barriers to an academic career (beyond structural ones that everyone has to deal with, like competition etc).


I'm a professor (also from an upper middle class background) with a 3-3 load, meaning I have to do some research but not a lot. If I never had to do any more research, I'd be fine with that, but I also really would not want to do a 4-4 teaching load, or teach less prepared students, which would be the consequence of getting a job doing less research in the current system.

So yeah, some of us much prefer teaching to research, but the way the current system is set up makes those positions less prestigious and more difficult. But the proposal is basically trying to recreate prestigious liberal arts schools, like Swarthmore and Williams, where faculty are expected to be excellent teachers, not so much researchers.


I mean, I don't deny there would be benefit there - I'm sure there are people who would make that trade.

I'm just not sure that the proposal would "siphon off" enough to fix the system, since the incentives are still borked?


Based on the way this ended, you should also give a link to Bret's own earlier post, "So you want to go to Grad School in the academic Humanities?": https://acoup.blog/2021/10/01/collections-so-you-want-to-go-to-grad-school-in-the-academic-humanities/

It is very similar to the advice given in section 6.


Academia selects for tolerance of suffering. Seemingly infinite capacity to tolerate suffering.

I got my Physics PhD later in life, and getting a postdoc would have meant uprooting the family, moving to a random place on this planet, giving up on work/life balance for 2-4 years, and then most likely washing out due to the extremely high competition for permanent positions everywhere in the field. The situation is about 10x worse for theorists like me.

Instead I went back to industry to a comfortable living and a 9-5 job as a senior SWE. Not a happy ending, but at least I can tell a good paper from bad after reading the abstract, for fun. And tell if a sensational headline in my area in a pop sci journal/site is complete BS (spoiler: 99% of the time). It is somewhat satisfying.


I can tell if a sensational headline is complete BS with 99% accuracy without even reading it.


> But why? Name recognition among other classical historians isn’t that useful; probably every academic classical historian knows every other academic classical historian, it can’t be a big job market.

“Name recognition” isn’t exactly what they’re going for. Rather, every academic wants to hire the people whose work they are constantly using and thinking about, and the academics are in charge of hiring.

Prestige among one’s peers is also a nice thing, so they’re willing to hire the person whose work is useful to many others in the field, even if they personally don’t use it.

Name recognition is only a poor proxy for this kind of prestige because, as you say, in a narrowly enough defined field most people know every other even moderately prominent person.

> This is the same question I ask about George Mason. Many people have remarked on how impressive it is that they have Tyler Cowen, Bryan Caplan, Robin Hanson, Garett Jones, etc, despite not being the sort of Ivy League school where you would expect famous people to congregate. The answer has to be that the department is selecting for Devereaux-like people with popular fame rather than academic fame. What tradeoffs are they making here, and have they paid off?

George Mason is an unusual case. I think it might have actually started with their law school going specifically for “law and economics” as their growth subfield. But their economics department very specifically selected for libertarianish heterodox people, who seem to speak to people outside of economics even if they don’t speak to economists. This is often a way that a lower prestige university can jump some rankings in a field, by taking a chance on becoming the specialist in some subfield that isn’t highly respected within the field, but might be in adjacent fields, or outside academia.


There are high-impact people in STEM at low-ranked places, but it's much harder in the humanities (per anecdotal reports from my humanities colleagues).

The humanities are defined by positions and debates, and only a few places (Oxbridge, the Ivies, Stanford; OIS) really get to set the terms. In STEM I can read a five-page paper by anyone and go "wow"; in the humanities it's a four-hundred-page monograph, and the OIS are a Schelling point.

It really sucks for people and generates a lot of resentment.


I've always heard the opposite. It's not too hard for someone at SF State, or Amherst College, to be influential in philosophy (just to name a few examples I can think of off the top of my head). But in the sciences, you're not going to publish influential work if you can't recruit some highly skilled graduate students to work in your lab, and those best grad students aren't going to take a chance on getting to work at the one high-performing lab in a lower-tier school (and they can't take that chance at a school that doesn't have PhD programs).


My data comes from English/Literature and History. Philosophy (at least analytic philosophy) may have a more STEM-like pattern (which makes sense; careers there are built on papers).


This is anecdata from my own STEM PhD applications a couple of years ago, but...

Every advisor and professor that I spoke to emphasized that, when applying to PhD positions, the lab and advisor were so much more important than the university that you should essentially disregard the latter. Of course, top labs and advisors are often at top universities, but there are sometimes exceptions. I now pass this advice on to new PhD applicants, so at least some of them should be looking for high-quality labs over universities.

> best grad students aren't going to take a chance on getting to work at the one high-performing lab in a lower-tier school

Again, in my experience, if advisors/labs really want a student, they'll provide informal guarantees ahead of time that the student can work with them. I had this happen at one university that I was unsure about attending, and I can't imagine why professors at lower-tier schools wouldn't want to do this for their best grad student applicants. (Of course, informal agreements can always be violated, but I at least believed it to be a sure thing.) If this practice is as widespread as I predict, it'd de-risk accepting offers from lower-ranked unis with great labs. Together, these factors should make it easier to have great labs that attract top students even at lower-ranked universities.


It's very difficult to get a prospective student to accept an offer from a lower-ranked school, no matter how much we offer them. The lab matters a lot, but it's unlikely there wouldn't be *any* lab for them at the schools they are accepted at.


This is correct. I am very open to moving, but I don't want to risk a step down in ranking. The quality of students I can get is one of the most important benefits provided by my university.


Academic institutions want to hire people they think will bring in lots of grants. Name recognition and prestigious publications are downstream of this.


That might be what the institution wants. But the institution doesn't run the hire - the colleagues in the department do. There are things that try to align those interests, and they're not too far off, but I think the grants aren't what the hiring department is primarily thinking of.


Right. Bret is a great *teacher* of history, able to take existing research and condense it into a lecture or blog post. What universities want is someone whom the next Devereaux will be likely to mention in his list of sources.


> This is the same question I ask about George Mason. Many people have remarked on how impressive it is that they have Tyler Cowen, Bryan Caplan, Robin Hanson, Garett Jones, etc, despite not being the sort of Ivy League school where you would expect famous people to congregate.

I've mentioned this a few years ago, but I think it bears mentioning again. The standard rankings that people think of mostly apply to undergraduate education. For graduate departments, every field has its own set of rankings, and different schools will sometimes have distinct specialties or areas of focus. The top undergraduate universities won't have truly *bad* departments in anything, because they need decent teachers and therefore they need decent grad students and therefore they need some decent professors doing research. But those departments might be mediocre backwaters from the perspective of the field. Meanwhile, other universities that aren't competing for the top undergraduate spots may have carved themselves out a niche in a few fields.

For instance, there might be a top undergraduate university that also happens to have a top law school, but, say, their linguistics department is a tiny thing that has a handful of "famous-in-the-field" professors but isn't really on the radar of the field as a whole. Or take the very name of the "Chicago school of economics"; I don't know what the University of Chicago's department of Economics is up to these days, but for a while it was an iconoclastic powerhouse.

The Economics department of George Mason might be going heavily for popularizers, but it also sounds like they have an affinity for approaches that resonate well with the rationalist crowd (if that isn't putting the cart before the horse, given that we're talking about Robin Hanson). And they may also have a political slant that allows them to "moneyball" talented professors who are being discounted by departments at other universities with different political slants.


In regards to George Mason, they're specifically known for being a center of the more moderate "Hayekian" branch of the Austrian school of economics (which is probably why they have this specific brand of libertarian economics bloggers there), whereas Auburn is known as a center of the more radical "Misesian" branch of Austrian thought, which is more typically associated with paleoconservative or right-libertarian movements.


Check the timeline: GMU didn't hire popularizers; GMU hires turned out to be popularizers. Why? Presumably because of how they all embraced blogging, at separate sites, at roughly the same time (when you could really get traction), and built from there. But I'm pretty sure the relevant folks were at GMU before they wrote the popularizing books, started blogging/interviewing, etc. (That is, reading MR, EconLog, and Overcoming Bias nearly two decades ago, they were already at GMU, and didn't yet have much fame/influence.)


That's interesting, thanks!


I don’t think the example with junior and senior devs is universal. Frankly, it feels very counterintuitive to me.

I have been running software engineering businesses (as CEO) for 12 years now. My first company had 200 engineers; the current one has 40.

Our best people were always inside talent, juniors we hired just after school (or even during) who learned and grew with the company.

A 100% salary raise over the course of a single year is not common, but we have had some such cases. However, growing 25+% per year is quite common (roughly 100% over 3 years).
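To check that arithmetic, a quick sketch with a hypothetical starting salary:

```python
# 25% annual raises compound to roughly a doubling over three years.
salary = 100_000  # hypothetical starting salary
for year in range(1, 4):
    salary *= 1.25
    print(f"Year {year}: {salary:,.0f}")
# Prints ~125,000, ~156,250, ~195,312 -- since 1.25**3 ≈ 1.95, just shy of 2x.
```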

I am very confident that for super-talented individuals it's much easier to get to a high position if they stay with the company than if they change jobs. A new employer will see "not enough years in the CV" and thus have a negative bias, whereas we have insider information about their performance.

We have always had transparent salaries, though, so everyone sees everyone else's salary.


Presumably you treat your employees well.

Amazon doesn't.


I agree the junior/senior dev thing doesn't fit my experience either.

In terms of *titles*, my company (not FAANG, but maybe one or two tiers below that) absolutely does hire/promote internally: I started as "junior" (associate) level and have moved up three times - was officially senior after only a year or two and am now a level above that. My current team lead was promoted out of our team's testers when the previous one left.

There are exceptions; I know some people have gotten stuck in a position due to maintenance concerns - "if you moved to position Y, nobody would know how to maintain system X" (which is obviously not great). But it's not some "we prefer outside candidates because we want to gamble on a superstar".

As for pay, it's a bit more complicated - over the last decade or so, you *are* probably going to maximize your pay by jumping ship as often as you can get away with: there's high demand and you can always use your previous salary to negotiate a new one.

Also, part of the dynamic is that at a lot of places you don't give your high performers raises, but rather bonuses, often in the form of stock. (Even my current employer, which is privately held, still gives out some internal, opaquely valued equity.) This isn't quite as good as a pay raise - the bonuses aren't guaranteed year-to-year and usually have a "vesting" schedule where you lose them if you leave - but it is a missed element if you just look at numeric salary and not "total compensation".


"Don't be irreplaceable. If you can't be replaced, you can't be promoted."

This is also consistent with the *inverse* Peter Principle: rather than being promoted until they reach their level of incompetence, the best employees become irreplaceable and get stuck in low-level positions forever. To maximize your chances of being promoted, you have to be bad enough they can't let you stay in your current position yet not bad enough to fire outright; if they can't fire you and they can't let you stay where you are, the only option left is to promote you.


I think the software job market discussion mixes several different types of companies and strategies, and the FAANG/non-FAANG labeling actually makes this worse, as the different FAANG companies represent totally different strategies. There is also a strong advantage to bringing unique knowledge, which actually rewards a person with 2-3 years of experience at another company, but I will ignore that for the rest of this. I may be wrong on the specific behavior of companies.

A lot of Bay Area tech companies seem to follow what you describe, with a lot of internally developed talent; the problem with these tends to show up during periods of rapid market price changes, when even a 25% salary increase can still leave you paid less than incoming engineers a level below you.

A special case of this is Facebook, Google, and Microsoft, where stock-based compensation forms a large portion of total compensation. Because stock is usually granted over 4 years, stock performance can matter more than anything else in determining total compensation (e.g. someone who joined Facebook in summer 2021 would have a much lower total comp than someone who joined in summer 2022, even given equivalent offers). In addition, some managers consider total comp when deciding base salary, while others evaluate base separately from stock.
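A minimal sketch of that grant-timing effect, with made-up share prices (not Facebook's actual numbers):

```python
# Two hypothetical engineers get "equivalent" offers: $400k of stock
# vesting over 4 years. The share count is fixed at the grant-date
# price, so what each actually receives tracks the stock afterwards.
def stock_comp_received(grant_value, grant_price, price_when_vesting):
    shares = grant_value / grant_price      # shares locked in at grant
    return shares * price_when_vesting      # value actually realized

# Joined at a summer-2021-style peak vs. a summer-2022-style trough:
print(stock_comp_received(400_000, grant_price=350, price_when_vesting=150))  # ~171k
print(stock_comp_received(400_000, grant_price=150, price_when_vesting=150))  # 400k
```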

Apple is another special case of a Bay Area company: internal promotion is relatively common, but they hold down salaries relative to other companies, relying on prestige/loyalty to avoid competing for talent (though they also have fantastic stock performance, and their decision to avoid overpaying in summer 2021 seems prudent now).

A few companies (Netflix) are very explicit about trying to pay people what they think they are worth in the market, while demanding high work output and aggressively firing low performers.

Some companies (outsourcing/consulting firms) avoid investing in interviewing or retention and hire lots of people expecting high attrition.

Amazon appears to have a unique strategy, but maybe someone else can explain it.


I also found that the programming example mismatched with my experience. Especially "Companies rarely promote their own juniors to seniors", which I haven't found true at all. At the companies I know, people get promoted regularly. Also the levels are more numerous—just take a look at levels.fyi for some examples. And I agree with Samuel that long tenures are a good way to reach high levels, because of accumulating that trust inside the company and also building the type of expertise that is hard to learn when hopping regularly and only having short tenures.

It can still be true that the best way to increase salary (in the short term) is to interview. Even for people who really are at the correct level at their company given their skills, they might fit one level higher at another company, or find at least one company who will mistakenly evaluate them as being at a higher level. It's hard to accurately measure people in interviews. Of course the person may not interpret that as their luck and the outcome of randomness given enough interviews, but rather they might conclude that their original company was holding them at an artificially low level and salary.

That said, it is quite common for people to switch companies, be in a similar role and level, and get a solid raise. Part of this is moving to higher paid companies, but I think a bigger part is indeed that raises within a company will typically be a bit behind skill growth and general wage increases in the market. My hypothesis is that this is caused by the price sensitivity differences between the groups. For one reason or another, people at the company have chosen to be there and stay there thus far, so there is some value they gain by being there relative to other places, or some hassle and cost to switching. Compared to outside candidates who aren't yet attached to that company, haven't built their lifestyle around working there, and may have multiple offers to consider without that bias. In aggregate it leads to slower raises within a company. It's the economics of the situation.

Altogether, I don't see many parallels with the academic job market. The one big commonality is that the pay distribution has high variance. But other than that, there isn't much. There isn't a formal two-tier system in tech that people get locked into when they join their first company. There is way more mobility.


>What tradeoffs are they making here, and have they paid off?

The tradeoff is that an otherwise boring run-of-the-mill woke institution like GMU has a persistent nexus of contrarian controversy. Here's an insider perspective: https://www.themotte.org/post/341/culture-war-roundup-for-the-week/60216


Interesting. I posted under another comment about GMU's economics department being known as a center of soft Austrianism; it's a shame this post seems to imply they're under attack from their own parent university and its student body over that. Probably another point in favor of the greatest threats to academic/research autonomy coming from within, not without, the university system.


In many ways this sounds really hopeful, doesn't it?

"Colleges really are looking for 'superstars'..."

So there actually is a massive institution dedicated to finding the smartest people? As with any institution, it generates a bunch of nonsense and weird boundary effects, but the point is that there really is a big network of people who are not trying to do anything other than identify great intellectual talent and put it to work? That's exactly what we want.


It is approximately true that there really is a big network of people who are trying to identify intellectual talent and put it to work. However, being a superstar (succeeding at the practicalities of funding & completing & publishing & publicizing high-profile research) also requires a lot of other skills and personality attributes beyond intellectual talent. And high-profile research is only modestly correlated with lastingly-important research. So it is true that the system does identify some superstars and help them to generate lastingly-important research (woohoo!) ... but the system also fails to reward some people with intellectual talent [because they are less skilled on the social side of things, for example] and some of the superstars are generating high-profile research that is not of lasting importance [because these superstars rely on social influence and/or marketing tricks to establish the influence of their work].


Name recognition among ordinary people is not well correlated with "prestige" in academia, which for a long time (though this may be changing somewhat) has followed what your letter-writers have to say about you when you come up for hiring, tenure, and promotion after tenure. And what the letter-writers have to say depends on your reputation among the 20 or so peer researchers in your specific specialized subfield, who are those very letter-writers. They're asked to say how good your latest research is, not what you have done for ordinary people lately. So the person who wrote the buzziest technical papers in the hottest area of classical history (or analytic philosophy, my field) will have a lot more prestige in the hierarchy game than the most famous public intellectual who sells a million trade books. Things are changing somewhat, but only in that some public intellectuals are rewarded more than they used to be; the prestige economy of trading letters about the buzziest specialist research is still the dominant paradigm (maybe it should be?).


Re: public intellectuals. There are two kinds: (1) someone who does major cutting-edge technical work first, then writes some books for the public (Murray Gell-Mann, Paul Krugman, Noam Chomsky, Richard Dawkins, Mary Beard), and (2) someone who just makes a name writing for the public (Michael Shermer, Sam Harris). Only those in group (1) get to work at Princeton or MIT. At the snootiest of places, public writing is seen as more of a sideline or hobby. (BTW, I love the fact that the academic image for this post is Corpus Christi, Cambridge.)


Gell-Mann wrote one book for the public, and I don't think it was that popular (though I did read it). Dawkins is MUCH more of a popularizer than a researcher.


Tangent about programming in big tech companies!

It's hard to overstate how difficult it is to tell if someone's work is good from the outside. If John's taking forever on a project this could be because he's really bad at coding, or the project could just be unexpectedly complicated. Or, commonly, it could be both! Often the project is *in the present* genuinely really difficult because John *in the past* built the system in a way that made it complicated to extend or modify.

(As an aside, I think this is very common– programming is hard for most people!– and this sort of accidental long-term self-sabotage is the entire reason the 10x programmer meme is real; it's comparing a Honda Civic with a Honda Civic that goes 1/10th the speed because someone took a sledgehammer to it.)

Anyway, these situations look identical to management, so the only people who can really judge your skills are your peers on the team, who know enough about your work and its context to tell if someone is productive or terrible. Of course, calling out another programmer on your team as sucking is a dick move. So real information about who is or isn't good doesn't propagate through the system efficiently and mostly gets inferred over time by management as someone accumulates a track record of successes or failures.

Software engineering at tech megacorporations is just Molochian principal-agent problems all the way down, which as Paul Graham observed is why startups can succeed at all– if you can build an environment where genuinely superstar programmers can work to the best of their abilities, where the systems all make sense and cleave reality at its joints, then I 100% totally believe you could run circles around a tech giant even with 1/100th of their money.


>"I was confused by this - why are they giving some people a big raise after a few years, but not others?"

An alternate way to think about this that I find useful: managers basically pretend that people very rarely leave, little can be done in the rare case someone decides to leave, and no one can predict who leaves when. (this is very self-serving, as it absolves management of responsibility for retention, otherwise there might be political implications in the status game)

This means that you hire essentially under duress to fill staffing holes, and look for the best deal while doing so (i.e. hire lower-cost replacements, i.e. provide that lower-cost replacement a promotion), while long-term retention gets little investment. It just happens to save the company a lot of headline money (we hire people cheaper than our average salary!) while allowing the company to ignore the cost of churn (well, we can't control when someone leaves, that's just the cost of doing business)

This strikes me as generalizing well to academia if you consider "value" to not just be salary but some sort of nebulous "future potential"


"But if they’re going for name recognition among ordinary people, why not privilege the historians who have name recognition among ordinary people?"

From my view as a consumer of pop science/pop history, there seem to be two reasons:

(1) Snobbery. So what if the unwashed masses know the name of Gerry Jones off the telly? Does he have the intersectional theoretical post-post-modern cred that will make the other institutes gnash their teeth and go "Why didn't we get him????"

(2) Quality. Often the guys (and girls) who make it into public consciousness *do* do so via "that guy off the telly" recognition. Dumbing down your subject to make entertainment does lead to a drop in quality. I've been terribly snobby myself about David Starkey, even though he is an established historian, due to all the BBC series he's done on the Tudors, and his 2008 book about Henry VIII, "Henry: Virtuous Prince", where the chapters all read like television scripts (and may have been intended to lead in to the 2009 Channel 4 show "Henry VIII: The Mind of a Tyrant").

Do Neil deGrasse Tyson's forays onto Twitter do anything other than beclown him? Going popular/populist often seems to lead to a decline in quality.

Bret Devereaux may be known and famous to *us*, but that doesn't mean the wider public in general knows him, so he's not a "Big Name off the telly" either.

EDIT: You do get the rare superstars who *are* big names in their field *and* get public name recognition, like Stephen Hawking (or Einstein, the literal poster boy for this). But in general, the ones who go on to be "Mr/Dr/Prof Telly Expert" aren't the ones at the cutting edge of the field.


I think being a Big Name on the Telly is something that's very difficult to shoot for; it requires you to either already be very accomplished, like Hawking or Einstein, such that the telly people want to seek you out directly, or to have connections to the telly people, which most people don't have. But being Bret Devereaux (or Tyler Cowen), and just writing a very good blog explaining your academic specialty to non-academics who may be interested, doesn't seem very hard -- it seems easier than publishing in a journal, at least, and anybody could at least give it an attempt. Why doesn't that matter?

My guess is that the GMU people are prestigious because they've actually produced very interesting ideas (Hanson) or research (Caplan), which (with all due respect!) might be less true for Devereaux. It seems to me like being a successful blogger might well lead to money for the university down the road, if you can attract a wealthy patron (or even if you can attract more talented people to apply for jobs, who might then get grants the normal way), but this is very uncertain relative to established techniques of getting the university money, remains very rare, and was probably wholly impossible before 2005 or thereabouts, so universities don't consider it.


"Why doesn't that matter?"

Because as you say: "anybody could at least give it an attempt". 'I write a simplified version of my subject on a blog accessible to the general public who are interested amateurs but not specialists or academics' isn't going to get you much traction. It's nice, but it's not the same at all as getting a paper accepted in one of the journals or being invited to give a presentation at a conference.


You would think that someone with Bret Devereaux's name recognition and probable ability to attract students (as well as his teaching prizes awarded by the students) is someone you would want, at _least_, in a steady position at your average academic institution. Sure, maybe Harvard only wants insider superstars, but by definition there's a very limited supply of those. Getting him at a typical salary seems like a no-brainer.


Devereaux's post makes reference to another lecturer at his university who is apparently universally considered excellent (voted best lecturer by students multiple years in a row, highly respected by his department), but notes that that doesn't really matter in an academic career. (I'm more cynical, but I wonder if this isn't just a high-charisma person who gives everyone As, much like how the doctors with the best reviews from patients are just those who tell patients everything they want to hear and prescribe them whatever drugs they want.)


>I was confused by this - why are they giving some people (programmers) a big raise after a few years, but not others?

Because there are programmers who spend 100 hours making a button move 1 pixel to the left on a webpage while making it 10x slower, and there are others who fundamentally understand how to make a 3D engine run on 90s hardware.

Computers do all the heavy lifting, and if they are going to listen to you, they start instantly. It's knowledge work where, possibly, if you're good enough, you can do things without communication/paperwork overhead.

The surprising thing is that it's only double.


I understand that in theory the skill differential in programming more than justifies the pay differential, but in practice, is it actually the top 1% genius programmers getting the top 1% salaries?


No, but I'd bet it's the top 50% getting double after 4 years.


While we're on the topic (if we are) of popular versus academic work, here's a comedian flogging his history book:

https://www.youtube.com/watch?v=Ef_6p_RIcxY


> One of the things I have long espoused, in the sense of thinking it but literally never telling anyone, is that the United States Congress should establish a system of National Teaching Universities that do exclusively undergrad and non-degree training.

Isn't this approximately the role that small liberal arts colleges fill? By definition, SLACs have a high professor-to-student ratio, and professors do little or no research. The main selling point is that undergrads will have direct contact (and potentially mentorship and project opportunities) with impressive professors. Many of these are high-status and charge high tuition, so they're attractive for professors in many fields (though perhaps not engineering or lab sciences).

High tuition and small class sizes exclude most applicants, of course, so this model doesn't make high-quality teaching widely available. But it *is* a system where professors' main responsibility is to teach.


I'll only echo the advice I heard one of the organizers give to someone else at a hard-science career fair at a major conference in that field: If you're a man, your spouse is a woman, and you're both academics (PhD or postdoc), she should try for academia (if she wants; she could also go for industry if she'd rather). You should focus on industry. You do not stand a chance for a tenure-track position unless you're very exceptional.

How this generalises to non-spousal situations is left as an exercise to the reader.


I know what you’re getting at, and the effect is there, but it’s weak and it’s shifting and unreliable. It will certainly not push a weak candidate over the edge.

Better advice is: don’t do it unless (1) you have good evidence that you’re exceptional -- e.g., early strong publications, respected by faculty, actually having good ideas that you deliver on; (2) it really matters to you to actualize that potential -- being exceptional is just a capacity; it doesn’t mean it will be easy, and you will have to make sacrifices to do it.

True for either member of the couple.

Comment deleted

We don't disagree. The key here is "identically qualified". I agree that, given identical qualifications, yes, there is a strong preference — and indeed, even extra money available from the Dean.

BUT, and this is my rather limited point: it would be a terrible idea, if you were female, to believe that being weak intellectually will enable you to compete with stronger candidates on the basis of gender preference. There's a little flex, obviously, but I have never seen a committee take a second-rate candidate on the basis of gender.

The cost is just too high to the department: you will have that person for seven years, at minimum, and everyone is trying to get the candidate that will help them and their research the most.

(One study that should be done — and that would contradict much of what I'm saying here — is where they attempt to measure the degree of bias, by looking at how preferences vary as qualifications are varied. It would take a little work to set up the experiment, but it's not beyond the mental power of a good economist.)

(In the spirit of anonymous debunking, I'll also say: literally nobody reads the DEI statement. It just doesn't get read. It's a bureaucratic form that people fill out, and I have never even heard it mentioned. Talk about a DEI statement litmus test is misleading, unless you mean the litmus test to be "are you willing to say something anodyne for two pages", in which case it definitely *is* a litmus test.)

To be clear, I'm in STEM and have never served on a humanities committee.


Immutable characteristics?!? Bite your tongue! If you want to be X, just say you're X, and you are! (Or so all the smart people tell us.)


Yeah, I am white but was born in SE Asia and grew up in a statewide housing project. I always wondered how much extra reaction I would have gotten saying I was Filipino.

Of course, these days it seems like being Asian is even worse than being white.


A lot of this comes down to taking a risk on a new hire. You get someone right out of college or some kind of program, and really have no idea how good or bad they are. They don't have or haven't been able to demonstrate certain skills at all, which includes direct work performance but also sometimes basic things like "shown they can come to work on a consistent basis" or "doesn't flip out at coworkers." The person is dealing with the same phenomenon at every potential employer, so there's not much pressure to pay them a lot coming in. You can't usually lower a pay rate after you hire someone for various legal and psychological reasons. So your new hire comes in at what is honestly too low pay for the type of work they will be doing.

Then, one of two things happens. They either do well or they do poorly, where this can be both hard skills directly related to their job tasks and also soft skills like getting along with the team and communicating well. If they do poorly, you don't want to give them a raise, which means you are reluctant to set up any kind of automatic raises for years in the job. You want to give raises to people based on merit and what they've done for you (and there's also an incentive to not give raises to people who aren't asking for them, since that means less money being spent).

New hires with existing experience don't have this problem because you can review their actual work record and test their actual skills. Someone with five years in the field may have completed difficult job tasks or projects, gotten promotions, or can describe in intricate detail how they do their job. This means an employer can justify a certain pay rate even on initial hire, and know they aren't getting fleeced.

No one ever got in trouble for hiring a great employee with proven skills, even paying them what they're worth. Lots of hiring managers have gotten in trouble for hiring a dud that the company has to now deal with - either through legally risky and personally difficult discipline and firing, or through having an underperformer just stick around and drag the team down.


Typo, section 1, 3rd bullet: "comment" --> "commitment"


The comments about looking at the *potential* for new recruits reminded me of something someone said about the valuation of startups.

The idea was that as long as the company is not making a profit, it's valued based on its potential. As soon as it posts a profit, it's valued based on its return on investment. This causes its market cap to take a huge hit, because the potential is always much larger.
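A toy version of that dynamic, with made-up numbers:

```python
# Pre-profit: the market prices the dream (a slice of a huge market).
addressable_market = 10_000_000_000    # hypothetical $10B market
imagined_capture = 0.10                # investors imagine 10% someday
valuation_on_potential = addressable_market * imagined_capture  # $1B

# Post-profit: the market prices the actual earnings.
first_annual_profit = 20_000_000       # hypothetical first profit
pe_multiple = 25                       # a generous earnings multiple
valuation_on_earnings = first_annual_profit * pe_multiple       # $500M

# Posting a profit *halves* the valuation here: the concrete number
# anchors the price, and it's smaller than the imagined potential.
```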

So if you don't have a track record, people look at your potential.

As soon as you do stuff, they can look at what you've actually done. Which is always much less golden than their fevered imaginings.


Related to 1).

Another interesting phenomenon is that while academic institutions seem to want to cultivate superstars, they sort of actively discourage this in the actual educational track. In undergrad and going into grad school, there seems to be a much higher value placed on conformity and parroting the right viewpoints than on actual intellectual superstardom.

They would much rather have an "above average" player who does a meticulous job citing things and regurgitates the professor's views than someone with higher horsepower and actual insights who is a little off kilter.

At least that was my repeated experience. “You were the best student by far in 48 of your 50 classes, but just blew off 2 for no reason due to irresponsibility” is seen as a big downside compared to the second best dude who gets an A in 50 courses but isn’t exceptional.

Which makes sense if you are looking to pump out cogs with high reliability, but doesn’t make a lot of sense if you are looking for superstars/intellectual advancement.


People are also looking to avoid the bad eggs. You want the highest achiever who isn't going to get you in trouble. Someone who flakes on 4% of their classes is a liability. Over a career, that's a lot of missed assignments, possibly causing huge problems for the organization.

If hiring someone, you also want to avoid someone who abuses their coworkers, has psychological problems, temper tantrums, commits crimes (especially at work), sexually harasses clients, etc. etc. You would much rather hire a boring cog than a brilliant person with amazing insight who occasionally gropes their colleagues. Organizations that try to hire such a person often regret it, even if they can help guide this star employee for a period of time.


Yeah, but we were talking about an academic world where they are supposedly shooting for stars.

Also, I would claim that in most work contexts the person who kills it on 96% of tasks and blows off 4% is still preferable, more valuable. You just reassign that stuff to a cog. Anyway, that has been my experience.


That's likely true, but I was looking at blowing off two entire classes over a college career (which is not missing a test or a few classes but instead the entire class) as the baseline. Someone who calls off sick once in a while but is otherwise a star performer is almost certainly a great hire. Someone who can fail more than one college course because they just didn't feel like it probably is a real liability.


I will second Simon's perspective. Working on personal projects or schoolwork teaches you how to code, but there is a large information gap between students and people actually working in industry. You see your first large codebase at a job. You see the types of tools and conventions that make working on a massive codebase possible at that job. You go through the horror of trying to set up your dev environment at your first job, etc.

Juniors do legitimately get at least twice as useful, if not more, after their first year or two, so their pay naturally doubles. After that it levels off. The puzzle is that companies seem willing to let these quasi-seniors go to another company. The answer, as provided by commenters, is that you only get that 100% pay increase once in your career, once you learn what the industry is actually like, but workers don't understand that. The company is forced to let go a quasi-senior who is already integrated into the team, in favor of a new quasi-senior whose experience is from a different company but who at least has some experience.


> workers don’t understand that

What exactly is so difficult to explain here? First, salaries and raises are often private. Second, even if they are not, the explanation is: "After one year of good work, John Doe is promoted from junior developer to senior developer, with the corresponding salary increase." (Or you could insert an extra step between junior and senior, and promote John to half-senior after one year, and to senior after two years.) That should explain things sufficiently.

I think the actual issue is that some of the current seniors are getting whatever they negotiated ten years ago, plus inflation, which is probably much lower than they could get currently. The junior negotiated his salary recently, so if you double *that*, it will be more than most current seniors have. And even if you keep salaries private, there is a chance that people will talk.


I mentioned in a different thread that there's also a strong possibility your new hire is a dud, and you don't want to automatically give them a raise. Actually, you specifically may want to avoid giving them any additional money, as that could be brought up in court as evidence they were doing a good job before you fired them.

I don't know tech hiring well enough to be fully confident, but it sounds like the big companies have figured out how to evaluate and promote juniors along a career trajectory, such that most of this phenomenon is about the smaller companies struggling to figure out if/when their new employees are doing a good job.

When you bring someone onto your team, you see their shortcomings as well as their strengths, and you've watched (and paid for!) their growth. At smaller companies, where the person deciding on pay rates has direct visibility into those new employees, it is hard to dispassionately hand out raises just because a person now has the resume to move up elsewhere. Imagine spending $200,000+ on a low performer's full compensation for a few years while they learn the ropes, and then thinking of giving them a huge raise about the time they finally start pulling their own weight. Dispassionately, this may be a necessary long-term cost of business. That's something a big company can think through and do, but a small company will struggle with.


Great point about promotion as potential "evidence before the court"; I would not have considered that.

The thing with "paying for their growth" sounds realistic, but it shows that there is some irrationality in the system. The basic question is, when you hire a junior developer, are you paying them (1) what they are worth *now*, or (2) *more* than they are worth now, because you expect them to get better later and repay their debt?

If it is (1), then it is not fair to feel like you "paid for their growth". You paid them exactly (in theory) what their *work* was worth for you at the moment. If you paid them more, it means you miscalculated. The experience they got is a side effect; if it wouldn't happen at your company, it would happen at some other place which would pay them about the same. (I am not talking here about the situation where the employer literally pays for *training*. In my experience that happens rarely.)

As a silly analogy, imagine that instead of software development, you are paying guys to move heavy objects from place A to place B. At the job interview you test them how many kilograms per hour they can move, and you set their salaries accordingly. After a year of hard work the new guy grows big muscles and now he can move twice as much, so he asks for a raise. Does it make sense to deny him the raise, because you paid for his muscle growth? In my opinion, no. You paid him for moving the heavy objects; the muscle growth just happened as a side effect.

If it is (2), the problem is that we have the unspoken "debt", which is not a part of the contract. So you do not give the former junior a raise, because he owes you for basically being overpaid during the first year. And if he switches jobs, he doesn't owe the new company anything, so he can ask them for a market salary. But it also means that if he switches jobs, his debt to you remains unpaid, right? But he has a legal right to leave at any moment; he is an employee, not an indentured servant.

So we have a kind of "debt" that is not legally enforceable and no one even knows how big it is -- the nice guys stay and pay it back (how long until it is actually paid back?), and the not-so-nice guys walk away and profit handsomely. This sounds like a situation that can create a lot of resentment. Not paying your debts is not nice. But acquiring a debt of unknown size, often without realizing it at the moment, is also not nice. On one hand, people are expected to behave like rational, profit-maximizing entities. On the other hand, there are unspoken debts of unknown size that there is a social expectation (albeit without legal enforcement) to honor. This all in a situation of information asymmetry (the employer knows the employee's salary, but the employee typically does not know enough to find out how much their work is really worth for the employer).

Could we avoid this all somehow? Maybe not. If an employer unilaterally decides to pay the juniors much less in order to avoid creating "debt", the juniors will simply apply elsewhere. A possible solution is simply not to hire juniors -- some companies actually do that. If many companies start doing that, the junior salaries will drop and a new equilibrium may be achieved when the juniors no longer acquire "debt". Or maybe the current situation will stay, because employees on average still pay their "debt". But as you said, "on average" works fine for big companies, but can cause problems for small companies.


At the moment of hire, it's almost always #2 for a new junior. It often takes new employees a few months to even figure out the basics of how the company operates and what the company wants from the employee.

In your example of manual laborers lifting items, imagine that instead of the employee earning an amount from what he can move, he tries and fails to move anything, or perhaps he moves the wrong things or puts them in the wrong places - in other words, potentially a dead loss for the company. Even if this employee eventually figures things out and gets them right, there's going to be a period of time when the employee is worth less, maybe far less, than what they are being paid.

This is likely true for every position for some period of time. Even extremely low skilled work will involve a few days with a trainer and a period of time with low productivity. It could be a day or a week or a month, which in most contexts isn't much time and isn't worth worrying about, but it's still there.

Overall, this is a known problem and has existed for at least 30 years as a serious thing to consider. There used to be a norm of a company hiring and training a new employee, and then the employee staying for their entire career. This was considered good for both employee and employer.

I can't figure out who shot first, as both sides blamed the other. At some point employers got tired of paying employees to get trained for a few years and then potentially jumping ship, and employees didn't want to spend their lives at a company just because they started there. My best guess is that some newer employers decided to cut training costs by offering more compensation to the most experienced employees of established big companies, which broke the previous equilibrium. But, everyone recognized that an employee needed training and that a good employee was worth paying for. It's just been an ongoing process of trying to offload costs onto others. Notably this is likely a significant factor in the increasing costs of education - employers offloading training costs onto colleges, which are paid for by the student instead of the employer.


Yes, the entire system (education + training + employment + salaries) now works differently, and people have probably not fully adapted yet.

I don't know how it is in other professions, but I suspect that in software development the situation is further complicated by people not having an idea how much a certain combination of skills "should" be worth. There is a huge variation in productivity between individuals, a big variation between companies in how much they are willing to pay, and it is difficult to estimate someone's productivity. This results in a situation where the same person interviewing at multiple companies can get dramatically different offers.

Imagine that we made the debt explicit. For example, your employer says: "The first year I will pay you 100 gold pieces, but honestly you only deserve 50. However, I believe that the second year you will deserve 150 gold pieces, but I will again pay you 100, which is how you will pay back the debt." But the second year you meet another potential employer who says: "Actually, I believe that you now deserve 200 gold pieces, so come to me and I will pay you 200."

In this hypothetical scenario with explicit numbers, you could simply change the job, pay 50 gold pieces to your former employer, and keep the extra 50 for yourself. -- If the numbers are not explicit, you only have a choice between leaving 100 gold pieces on the table, or not paying back the debt of 50; both options suck for someone.
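The same accounting, made explicit (same hypothetical numbers as above):

```python
# The explicit-debt scenario above, in code. All numbers hypothetical.
year1_pay, year1_worth = 100, 50   # junior year: overpaid by 50
implicit_debt = year1_pay - year1_worth            # 50 gold pieces

stay_pay = 100                     # second year at the old employer
outside_offer = 200                # what the new employer would pay

# Staying repays the debt via underpayment; leaving and settling up
# explicitly would net more while leaving no one shortchanged.
net_if_stay = stay_pay                                   # 100
net_if_leave_and_settle = outside_offer - implicit_debt  # 150
```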

What I am trying to say is that there is a difference between "changing the job is profitable because you default on your implied debt" and "changing the job is profitable for debt-unrelated reasons, such as the new company is able to use your skills more profitably". The former is unfair; the latter is exactly how the market is supposed to allocate resources in theory. Without explicit numbers it becomes impossible to tell the difference; and realistically, it will probably often be a bit of both.

I could imagine a nice new equilibrium with a 4-day workweek and training new skills on Fridays -- paid by the employees; optionally subsidized by employers as a company benefit. (The current situation with 40-hour workweek and training in one's free time seems unsustainable, especially when you have kids.)


Reading about the comments on programming salaries, I wonder if the readers are speaking from a non-US, non "mainstream" tech company POV (where "mainstream" includes big tech (eg: Amazon), medium-to-large start-ups (eg: Pinterest), and dated but still present companies (eg: Oracle)).

Context: I went to pretty good CS school, so me and most of my college friends all either work at mainstream companies or founded our own startups. Plus, most of us interned at 4-6 different places prior to graduating - so we've shared information amongst each other about a LOT of companies!

---

For "mainstream" tech companies in the US, what I've read couldn't be further from my impression of reality. Salaries at all of these companies are:

1. Much higher than described (even Oracle will give ~120k to a junior engineer!)

2. Much more transparent than described (salary sharing is much more common than other sectors I've worked in - levels.fyi has a TON of data)

3. Much more consistent within levels than described (bonuses reward strong contributors, but cash salary typically falls within a fixed range)

Also, promotions at these companies _are_ often commensurate with your skills, although as you get to higher levels, the limiting skills become more and more non-technical... because most tech companies don't need every engineer solving super complex technical issues! They need them solving *business problems* to create value for the company, which may involve challenges in areas like disambiguation, coordination, communication, or experimentation. Whether you consider these part of your job will determine whether you think promotions are correlated with "ability" or not.

Yes, it is typically _faster_ to grow levels by jumping between companies, and that can result in higher compensation thanks to the opportunity to negotiate, especially over equity and signing bonuses. But that's more about having a negotiation opportunity and knowing how to use it than about anything deeper.

That said, many friends from college and I have been promoted internally just fine, with increases in compensation commensurate with our new levels.

Expand full comment

As someone mentioned, in STEM the start-up package is $100k-$1M. If you don't get that back in grant funding before you are up for tenure - good luck. They would rather try again with a new Assistant Professor.

At the top schools, they get external reviews of the tenure candidates. When I was at one such school, an inorganic chemist came up for tenure. They asked for rankings of him vs. a list of other candidates in the same field at similar institutions. He wasn't #1 or #2 on the lists, so no tenure. They made an offer to the guy who was #1 on the most lists; he left another top school and is still there today.

Expand full comment

> At the top schools, they get external reviews of the tenure candidates.

I think they do this at practically all schools? Or at least top few hundred. From what I've seen in the sciences, it's only the top handful (in a given area) where they make a habit of not tenuring people.

Expand full comment

Yeah, I feel like the "supply and demand" argument - specifically the "too much supply, not enough demand" aspect - can carry the lion's share of this.

A lot of the original article seemed to have "STEM is different" caveats and I think a good explanation of why STEM academia would be different is just that STEM academia has *far* more competition with the professional world. Being a STEM academic usually means passing up a fairly lucrative career as a professional.

But if you study Medieval French Literature... well there just aren't that many places hiring for Medieval French Literature specialists other than academia, so you end up with a glut of people with degrees competing for a few positions.

I feel like a lot of the issue is the modern "everyone should at least try to go to a 4-year college" attitude combined with a general lack of emphasis on a *career* plan. Fairly subjective, but in my high school and college years, I remember there being fairly little focus on picking a major with an eye toward future employment.

(Even for myself: I ended up doing a Math/Comp Sci double major, and obviously half of that turned out to be very employable. But even I basically just majored in what I enjoyed doing; I was just lucky enough to enjoy a well-paying field.)

---

I guess you could test this by looking at non-STEM fields that also have strong professional career opportunities, e.g. Law. IIRC, neither Scott's post nor Bret's specifically mentioned it, but does the academic law scene look more like STEM, or does it look like the rest of academia? If it doesn't look much like STEM, I guess there'd have to be a more STEM-specific explanation.

Expand full comment

Also there's a type of great student who isn't much use in the real world who tries to stay on the academic track for obvious reasons, without realizing how different being a very successful prof is from being a very good student. Being a student doesn't test the same skills as being a professor, unlike many other pursuits where there's general alignment between how you try out and the ultimate gig (not all NCAA bball players make the NBA, but being good in college ball generally translates to the pros). So the supply/demand mismatch is even worse.

Expand full comment

Other than a grandfather who was always harping on about studying math, literally almost all the advice I got about college was “do what you like best”. Which was just absolutely garbage advice.

Expand full comment

It's good advice for a ton of people who won't work hard at things they don't enjoy. If you'd get As in something you like and Bs in something you hate, then it doesn't much matter how much more lucrative the second field is.

The right answer is: make a list of the things you're interested enough in to be prepared to really work at, then pick the most lucrative, highest demand field of those.

Expand full comment
Jun 6, 2023·edited Jun 6, 2023

> Several people disagree, and say these institutions exist (community colleges, technical colleges, etc), but upper-class people don’t go there because they’re not prestigious enough. What would it take to have something like this which was also prestigious? Or is that undesirable, because it would rob us of the research positive externality.

I would say the prestigious version of this institution *does* already exist, in the form of elite undergrad-only liberal arts colleges such as Harvey Mudd, Williams, or Amherst. The students tend to be very good, and the lower teaching load and smaller classes than at most places (funded by sky-high tuition) attract generally very good teachers. If I were going for a teaching-only career, I would definitely want to be at one of those places, even more so than at a research-elite (a.k.a. "R1") university such as Berkeley or Stanford. The students are also excellent at top R1 places, but the classes (at least in computer science) can be huge. Perhaps the teaching load is higher for teaching-track faculty at R1 places than it is for professors at elite liberal arts colleges (not sure though, and it might vary by place).

But even at an undergrad-only elite liberal arts college, it's not *pure* teaching like at a community college. I believe some modest amount of research is expected and/or encouraged (the professors there generally have PhDs for this reason), primarily because many of the students want to get into elite graduate schools, and (in STEM) doing research as an undergraduate is the best way to get into an elite graduate school. Still, I'm sort of guessing here, since I don't myself work at a liberal arts college.

Expand full comment

Undergrad science research isn’t only useful for getting into an elite grad school. It teaches skills that can be extremely helpful in entry-level science jobs.

Expand full comment

Also, doing a bit of research is the best way for non-specialist management to ensure that their professors are staying up to date in their field. Sure, you can stay up to date just by reading the literature, but the easiest way to do an external check is to look at whether they can still get a paper through peer review.

Expand full comment

No mention of pro sports, huh? Nearly all of the North American pro leagues at this point have had to include rookie scales in their collective bargaining agreements with the respective unions because of newly drafted players seen as having enormous potential getting paid more than established veterans. This wasn't even just pushed by the players, either. The owners were in part trying to save themselves from themselves because they know they're overpaying based on the high risk but do it anyway.

In part, I suspect this has somewhat to do with egos and the nature of "success" in some of these fields from the owner's perspective. Look at high-profile flameouts like Dan Snyder recently, Donald Sterling with the Clippers, or Frank McCourt with the Dodgers: they drove teams into the ground, failed over and over, put out a horrible product, got forced out of the league by other owners (or by their own wives, in McCourt's case), then sold for enormous multiples of what they purchased for and became or remained multibillionaires anyway. Being in charge of a university or owning Google is probably a lot like this. What does "risk" mean for you? Jeff Bezos probably lost the most net worth anyone has ever lost by getting divorced, yet he's still the world's third richest man. It impacted his ego only. That can't have had any material impact on his quality of life or ability to buy whatever the hell he wants to buy from now until eternity.

Harvard doesn't have owners, of course, but realistically what would it take for Harvard to ever not be considered a top ten school in the US and not have an 11-figure endowment or whatever it's up to? They're competing solely for bragging rights and bragging rights are all they can lose. To me, that explains a whole lot of apparently irrational market behavior because there aren't real markets, at least not competitive markets, if nobody involved can possibly suffer actual ruin and the marginal utility of a dollar is so close to zero that you have to have non-monetary goals like US News rankings or winning a Super Bowl or being the first to Mars.

Expand full comment

Various thoughts:

> Bret Devereaux is ... probably one of a tiny handful of academic historians I (and many other people) have heard of.

I doubt it matters. Academia is very inward-looking. As the joke goes, a physicist in Boston knows more about the physicists in Tokyo than he does about the chemists in the next door building at his university. The critical question is how well you are known by academic historians, especially the highest-status ones. Also, Devereaux complains that it's easier to become famous as a blogger than to get on the tenure track. I'm sure that's true, but OTOH being a famous blogger is still gig work rather than the iron-clad job security at high pay of a professorship at an R1.

There are lots of complaints that the end of mandatory retirement is clogging up the job pipeline. I've seen references that this is not so; professors generally do retire in their mid-60s.

This is different in STEM fields. For one thing, the students are worked hard in labs but generally don't have a heavy teaching load, even in math, which has no labs and a high percentage of "service courses". But in STEM, it's easy to jump into non-academic employment, so there's an upper limit to the degree people can be exploited. (If you're in math, lots of technical employers will be superstitiously impressed, which can compensate for lack of demonstrable job skills.)

Also, I have seen recent reports in Science magazine of a shortage of postdocs, to the point that granting agencies won't allow you to put a high enough postdoc salary into a grant that you could actually hire a postdoc. But again, I'm sure that's because such people can bleed off into industry more easily.

I do wonder why so many people go into Ph.D. programs in non-STEM fields. It's well-known how gruesome the job market is.

Expand full comment

What else are they going to do? You've got a certain type of student in whatever type of humanities who really, really loves philosophy, romantic poetry, whatever. There's no job track that really makes any sense beyond a PhD, and academia allows one to fend off the real world for a bit longer (particularly if parents are helping fund the studies). Plus the Dumb and Dumber angle: "so you're saying there's a chance . . . ."

Expand full comment

And honestly the students were (maybe still are) getting horrible advice. At least when I was an undergrad 20 years ago.

The recent grads who were new to the department preached caution, but the old guard of professors who came up in the '60s and '70s were just about getting absolutely everybody into grad school. And if you were talented they were just like "who cares how much debt, it will all work out fine".

Well it worked out fine for them because they came into the field in an era of 5:1 or 10:1 retirement matching. I know a professor who underestimated his retirement nest egg by like 6 times! (He thought he had $600-700k; he actually had several million.)

And yeah a huge part of it is simply people not wanting to grow up, and sticking with what they know.

Expand full comment

> I’m confused by the “superstar” claim. Bret Devereaux ...

You're conflating two different senses of "superstar": 1) known to the general public via mainstream publishing, blogging, and podcasting, and 2) high reputation within academia.

The counterintuitive bit is that 1) is often assessed *negatively* in regard to 2), and it's 2) that dominates in hiring decisions.

Expand full comment

The commentary around software/IT doesn't match my experience at all. Everywhere I've worked has a fairly linear promotion/salary scale. And I've seen a significant number of people climb the ladder as well. I wonder what the disconnect is...

Expand full comment

I just finished a PhD in STEM at a tier-1 name-brand university in exactly 5 years. It was an extremely challenging process. My advice to PhD students would be to read a good book on negotiation. At the PhD level, when you graduate is completely subjective and determined by what you negotiate with your advisor, not by your ability to complete mandatory requirements as at lower education levels.

Even though many of my peers have performed better research, the thing that worked for me was that I got a job offer in an adjacent research-focused industry job, and that helped pressure my advisor to let me graduate - and/or knowing that I had other options made it more OK to my advisor that I was moving on. Of course this depends on your career goals; I didn't totally enjoy the PhD process, so I didn't want to put in more effort than I had to (I mean, I love research, but the low pay and hours and the power dynamic with my advisor were not enjoyable). I have other friends who were on the cusp of discovering something big, so they stayed 1.5 more years to work on the same topic, which ended up boosting their personal prestige and hireability.

But if you are at year 3 or 4 and don't want to go to 7 or 8 years, it is good to start negotiating, and strengthening your negotiating position, for graduating.

Expand full comment

I'll note that this seems to be a very US thing; here in the UK there are quite strict deadlines for finishing STEM PhDs within 4 years, to the point where lots of people get hired on as postdocs for a few months just to finish things off in the lab before they move on to another job.

Expand full comment

Regarding job hopping in software dev (and other fields):

I think that many orgs end up hiring based on expected results, but give raises based more on effort and seniority.

Basically, when you're hiring, you don't know the folks and you're comparing everyone against each other fairly ruthlessly. You are in a good position to decide one new hire really is worth twice a different hire.

Performance reviews tend to be an awkward chore at most places. It's also just really tough to look at two people you know and like, see them both working hard and doing their best, and then decide one of them is worth twice the other. Human instincts around fairness pull hard on you - it isn't fair for two people to put in the same effort and for one to get twice the reward. So you end up with a system where everyone gets some modest raise every year with a bit of an adjustment for performance, rather than actually assessing everyone's performance every year and moving salaries up and down to match market values.
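
As an illustration of how far those uniform "modest raises" can drift from market value, here is a toy compounding sketch; both growth rates below are made-up assumptions, not data:

```python
# Illustrative only: a flat internal raise vs. a market value that grows
# quickly early in a career. Both rates are assumptions for the example.

INTERNAL_RAISE = 0.04   # the "modest raise every year"
MARKET_GROWTH = 0.15    # assumed early-career growth in market value

salary = market = 100_000.0
for year in range(1, 5):
    salary *= 1 + INTERNAL_RAISE
    market *= 1 + MARKET_GROWTH
    print(f"year {year}: salary {salary:,.0f} vs market {market:,.0f} "
          f"(gap {market - salary:,.0f})")
# The widening gap is exactly what a job-hopper captures in one move.
```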

There can also be social problems if two people are hired around the same time but one of them takes off while the other doesn't. That can definitely create some resentment and feel like favoritism or bias. It's psychologically easier if everyone does the same little shuffle - I poach your guy and you poach my guy and nobody left behind has to sit there stewing about how they're being mistreated.

Expand full comment

For sure this is a big part of the need to switch jobs. People hate making even their weaker friends sad.

That, and I also think you don't want to underestimate brute anchoring. "Sure, we hired Bob to be a secretary, but he is the smartest and most effective person who works here" generally doesn't lead to Bob's boss, or even his boss's boss, thinking Bob should replace his boss or get some giant change in status/pay.

Much easier for everyone if Bob just goes and becomes a boss somewhere else. Reduces the drama.

Expand full comment

Paul Graham talked about a similar thing discussing how YC or other good venture capitalists look for startups to fund - I can't remember which of his posts it was exactly though, but http://www.paulgraham.com/growth.html comes close.

The basic idea is: screw expected value (that's what traditional investors are for), what's the chance of them going unicorn? From that perspective, something that has a 5% chance of becoming the next big thing and a 95% chance of failing is a better startup than something that has an 80% chance of becoming a financially stable, so-so medium company but only a 1% chance of going big.
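
The point survives even on expected-value terms, because with a unicorn-heavy payoff the tail dominates. A quick sketch with hypothetical return multiples (100x for a unicorn, 3x for a so-so outcome; both made up for illustration):

```python
# Hypothetical return multiples, chosen only to make the comparison concrete.
UNICORN, MEDIUM, FAIL = 100.0, 3.0, 0.0

def expected_multiple(outcomes):
    """outcomes: list of (probability, return-multiple) pairs."""
    return sum(p * r for p, r in outcomes)

risky = [(0.05, UNICORN), (0.95, FAIL)]                  # 5% shot at going big
steady = [(0.01, UNICORN), (0.80, MEDIUM), (0.19, FAIL)] # likely so-so company

print(round(expected_multiple(risky), 2))   # 5.0
print(round(expected_multiple(steady), 2))  # 3.4 -- worth less to a VC,
# even though the "steady" startup succeeds far more often
```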

I presume it's the same story for TT hiring: the first thing you care about is the chance of them getting into Nobel Prize territory (though a Fields Medal probably also cuts it). I think the commenters who explained how proving you're good-but-not-great essentially locks you out of TT at an R1 have hit the nail on the head here.

Expand full comment

> Superstar effect is a big explainer. People want Steph Curry, and are willing to fight over potential candidates that have that potential. Experience does not turn [insert middling player] into Steph Curry.

Maybe not Curry, but this is pretty much what happened with Jokic, the best player in the league and already one of the best centers of all time at 28, who was infamously drafted 41st overall during a Taco Bell commercial. There was a post on /r/nba that looked at draft pick order and found that it had basically no predictive value after the first few slots. It is the case that most superstars go early, with some exceptions (the aforementioned Jokic, Kawhi Leonard, Kobe Bryant). But a lot more not-quite-once-per-generation stars end up being drafted much more towards the middle of the pack.

Expand full comment

Sounds like what Bryan Cranston said about acting. Basically - don't get into acting unless you can't bear doing *anything* else. If another job could make you happy, do that one.

Expand full comment

> Also, many many many people, both tenured professors and people who had dropped out of the academic track, commented to say that going into academia was a bad idea and that people who were considering it should think very carefully and probably change their mind.

It's very annoying that this is a meme I've more-or-less known about since forever and that it still didn't quite click until I was doing a PhD myself. Possibly part of it is that I know plenty of IRL people who told me otherwise, that doing a PhD was a good idea, that even if most people fail/hate it I wouldn't, etc. The vast majority of the "don't go into academia" advice I got was online and not targeted at me specifically. It was hard to really take it into account in the face of everything else.

Anyway, I now make sure to talk about this personally and face-to-face with everyone I know who is considering academia. But I don't think I've gotten through to anyone yet. Kind of hard for people to take you seriously when you're going "do as I say not as I do" on them.

Expand full comment

Yup, cult socialization is an evil thing.

In my experience, community influence can easily be strong enough: I saw quite a few folks who got into the rationalist community in college get plenty of warnings from it, and they either reconsidered the PhD or proceeded very cautiously.

Expand full comment

I thought George Mason had all of those people because of the Mercatus Center, which has a clear ideological focus, which moved there because of a Koch donation back in 1980. I think the 'popularizer' impression is downstream of it being the place to be if you're a libertarian of that particular stripe.

Expand full comment

Your numbers for FAANG compensation are wrong (way too low), and they also don't model the dynamics well. There _are_ explicit job levels (ranks and titles) with formal promotion processes, which tightly band your comp. You _can_ absolutely get promoted inside the same FAANG you start at; it's a very metagamed thing with some toxic dynamics, but you absolutely don't need to leave to do it.

Expand full comment

> But why? Name recognition among other classical historians isn’t that useful; probably every academic classical historian knows every other academic classical historian, it can’t be a big job market. But if they’re going for name recognition among ordinary people, why not privilege the historians who have name recognition among ordinary people?

> This is the same question I ask about George Mason. Many people have remarked on how impressive it is that they have Tyler Cowen, Bryan Caplan, Robin Hanson, Garett Jones, etc, despite not being the sort of Ivy League school where you would expect famous people to congregate. The answer has to be that the department is selecting for Devereaux-like people with popular fame rather than academic fame. What tradeoffs are they making here, and have they paid off?

For the first paragraph, if you believe what Bret Devereaux says (on another topic), it's probably just a mistake on the part of the colleges.

Devereaux frequently writes about the difference between, in his terms, engagement and advocacy. Advocacy is when you attempt to harangue people into doing something that you want them to do, like dividing their trash into several different buckets. Engagement is when you do something that other people want you to do, like writing about a topic they're interested in.

Devereaux constantly comments that the rest of "humanities academia" sees advocacy as its duty. In that view, advocacy is the purpose of being a professor. Devereaux contrasts his own view that advocacy is a luxury you indulge in for your own benefit, and engagement is what gives you the means to do so.

I think it's a small leap to the conclusion that universities have lost track of what gives them influence out in the rest of the world, so they aren't looking to make hires that would help with getting more of it.

For the second question, what I've always read about George Mason is that their strategy is ensuring that it's ok to not be left-wing on campus. It's not so much (according to what I've read, which hasn't focused on this question) that they're looking for popular appeal, it's just that anyone who's not very far left of center would naturally gravitate to them.

Expand full comment
Jun 7, 2023·edited Jun 7, 2023

"I have one former student who claims he got his job because he told his boss he understood stats well enough to tell good stuff from bad. He said when he saw 'bad' stats or research design or whatever, he'd just tell his boss, "If I turned that in to Dr X (me), the nicest thing he would do is give me an F, throw the work at me, and tell me to do it over." He became the go-to guy for stats in his job. He doesn't DO stats, he just knows them well enough to tell good from bad."

Quality Assurance can be a great path if you're good at thinking and intellectually well-rounded from studies/work but lack the specific technical skills. I can't program worth a damn, but my philosophy and logic training genuinely comes in handy as a software tester.

It's been said that programming requires finding one way that works, while testing requires finding all the ways it could fail. Those call for different skillsets and different mindsets.
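
A toy illustration of that split (the function below is hypothetical, not from the comment): the programmer needs the happy path to work; the tester enumerates the ways it can fail.

```python
def average(xs):
    # The "one way that works": fine for the inputs the programmer had in mind.
    return sum(xs) / len(xs)

# The tester's mindset is the complement: hunt for inputs that break it.
suspicious_inputs = [
    [],                 # ZeroDivisionError: empty list
    [1, 2, "3"],        # TypeError: mixed types
    [float("nan")],     # returns nan and silently poisons downstream math
    [1e308, 1e308],     # the sum overflows to inf
]
for xs in suspicious_inputs:
    try:
        print(xs, "->", average(xs))
    except Exception as e:
        print(xs, "->", type(e).__name__)
```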

Expand full comment

"Also, many many many people, both tenured professors and people who had dropped out of the academic track, commented to say that going into academia was a bad idea and that people who were considering it should think very carefully and probably change their mind."

Back to Bret Devereaux:

"So Should You Do it?

This is a tough question for academics to answer. Our profession runs on an apprenticeship system and so on a fundamental level we want younger scholars following in our footsteps. We’re excited to learn that our students are thinking of going forward in the field because we are excited in the field. Some of my past undergraduate students are doing their graduate studies now and it makes me smile with pride thinking about the great things they will learn, do and write. So it pains me to say that, in most cases, the answer is pretty clearly:

No, you should not."

https://acoup.blog/2021/10/01/collections-so-you-want-to-go-to-grad-school-in-the-academic-humanities/

Also, again, this probably changes for STEM. Since people with STEM skills can usually find better paid outside work easily enough, institutions have to be far, far nicer to them.

Expand full comment
Jun 7, 2023·edited Jun 7, 2023

I am a junior programmer, and the section on the programmer job market sounds completely foreign to me? Back when I was in college applying for new grad jobs, almost everyone I knew (in college and in Discord groups for CS majors) who ended up at an industry programming job at all was getting offers for at least $150k, though some people who really struggled ended up at jobs paying as low as $110k. A substantial proportion (40%+) of those new grad offers were from non-FAANG. My own lowest offer was $175k from a no-name public company. I don't recall jobs that pay <80k being ever discussed or mentioned.

(This was back before the market for software engineers crashed late last year).

Here are my hypotheses on how to explain the discrepancy:

1. Maybe the commenters who mentioned those $60-$80k junior programming jobs are not in the US.

2. Maybe job listings with "junior" on the title pay way less than those with "new grad" on the title. I and the people I know applied predominantly for "new grad" jobs.

3. Maybe there are people whose literal job title is "programmer" or "computer programmer," and they're paid way less than people whose title is "software engineer." I was under the impression that companies largely stopped using the "programmer" title because it makes the USCIS reject H-1B applications, but maybe there are a lot of companies out there who never hire H-1B and so still use that title, and they pay way less?

4. Maybe the commenters were talking about the status of job market 15 years ago.

If neither (1) nor (4) is the answer, I'm curious about the demographics of junior programmers who make $60-80k in the US. Do they typically have CS degrees? How do they hear about and get those jobs? Do they know about the existence of $150k+ entry-level jobs?

Expand full comment

5: Your friends and acquaintances are a very self-selected group. Even Caltech says its grads have an average $120,000 starting salary. There is no way they would lowball themselves. MIT doesn't give their numbers, but third-party sources for MIT CS grads give a very similar figure.

And I wouldn’t discount the possibility of very strong selection bias. There is a classic Scott post about self selected groups here: https://slatestarcodex.com/2017/10/02/different-worlds/

Expand full comment
Jun 7, 2023·edited Jun 7, 2023

> $120,000 starting salary

That's the base salary, not total compensation.* I was mostly using total compensation numbers in my comment. Sorry for not making it clear. Base salaries among new grads I knew all fell in a pretty narrow range, between $110k and $150k outside of finance, and I don't recall ever seeing <$90k.

I assume companies that pay $60-80k don't give out a lot of equity (or any, for that matter) so comparing that base salary with the $170k+ total compensation offers I'm familiar with is probably fair.

* Universities almost never take into account bonuses and equity in those alumni outcome reports, even though they can be huge. For people who went to top schools, equity and bonuses often comprise the majority of their income. So those reports are not very useful.

Expand full comment

My theory of programmer salaries is that some programmers are chumps. Suppose a newbie is worth $75k and someone with a few years' experience is worth $150k. You hire a newbie for $75k and three years pass. If he's smart he starts thinking "Man, I could go to some other company and get $150k," but if he's dumb he never thinks that; he just keeps working for you, providing experienced work at a newbie salary. If enough people are chumps, then the best strategy is to let your non-chump employees switch to a new company in order to get away with underpaying the chumps.
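
A minimal sketch of that model, using the comment's $75k/$150k figures; the replacement cost and chump fractions are made-up assumptions, just to show where "enough people are chumps" flips the strategy:

```python
# Toy version of the "chump" model above. The $75k/$150k figures come from
# the comment; the $25k replacement cost is an assumption for illustration.
NEWBIE_PAY = 75_000
EXPERIENCED_WORTH = MARKET_PAY = 150_000
REPLACEMENT_COST = 25_000   # assumed cost of a non-chump quitting

def surplus_if_underpaying(chump_fraction):
    """Employer surplus per experienced dev when nobody gets market raises."""
    chump_gain = chump_fraction * (EXPERIENCED_WORTH - NEWBIE_PAY)
    churn_loss = (1 - chump_fraction) * REPLACEMENT_COST
    return chump_gain - churn_loss

# Paying everyone market rate yields zero surplus on experienced devs,
# so underpaying wins whenever the number below is positive.
for f in (0.10, 0.25, 0.50, 0.75):
    print(f"chump fraction {f:.0%}: {surplus_if_underpaying(f):+,.0f}")
```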

Expand full comment

Some more points re: the programmer market, which maybe generalize to the academic job market:

> Why? People mostly speculated that the junior/senior distinction is reality-based; it takes 2-4 years to learn to do the most important tasks. The failure to hire insiders might be because workers aren’t on board with this distinction and would expect a more linear raise schedule.

It's actually a lot more rational than that. First, there's a risk/reward tradeoff in promoting people internally. Due to inertia and the general difficulty of letting people go, there's a risk associated with promoting an insider: it might not be obvious that they lack the required skills, or they might not be politically viable as a promotion candidate. On the skill side, if they are promoted too aggressively, there's a higher perceived risk that they gamed the promotion-evaluation system. Someone recently promoted to a senior position is a lot more difficult to immediately let go of (and it's practically impossible to demote internal folk), so if they are let go too soon after a promotion, it shows up as an embarrassing mistake by a manager somewhere.

On the politics side, this enforces a relatively conservative set of requirements for promotion (the candidate has to have visible achievements that are exemplary or above average for someone already working at the new level). Which means, internally, managers will err on the side of caution and only "promote after someone's been performing very well at the new level for a while", which is usually uncomfortably late for everyone involved, and then creates pressure for a larger jump in comp.

There's also the angle related to hiring strategies: one might be less risk-averse and hire 4-5 risky juniors for every 1-2 people who eventually make it to a senior position. So there's some pressure to recoup that cost, i.e., the cost of training and educating 4-5 people, of which roughly, say, 50% might be let go or otherwise leave before making it to senior. So, especially at mediocre or badly managed companies, there's pressure to recoup that cost by having folk who are starting to perform at senior levels stay at the lower tier, or at lower comp within the same bracket, for much longer than appropriate.

Looking at it from the point of view of hiring an outsider into a more senior position, it can look a lot less risky. The senior person will have 3-6 months to prove themselves, during which period they can be let go (or alternatively demoted; it's actually a lot more reasonable to demote a new hire under the right circumstances) with a lot less political cost or fanfare. The risk/cost of training and education is paid for by someone else. The evaluation process for new hires is usually perceived to be much more selective and harder to game. To de-risk things, some of their higher compensation can be in the form of a signing bonus that gets clawed back if they aren't a good fit. Similarly, on a political de-risking level, it's easier to fudge their experience and historic achievements, to make it sound like they've accomplished enough of the politically necessary, high-visibility projects. So it's easier, especially with mediocre management in place, to justify paying a premium when hiring externally for more senior positions.

As a caveat: talented managers will usually find ways to make the comp and promotion curve much smoother, even when forced to work within the bounds of an unreasonable external framework.

----

This pattern can actually get really bad at the higher echelons of large engineering orgs. FAANG-like companies notoriously tend not to have enough of the required high-visibility projects or work for folk to get promoted to the higher echelons of the engineering org at a reasonable rate (and to then be compensated appropriately). It's often much easier/faster to get to the higher levels and force a compensation correction by leaving, achieving the politically required high-visibility work or projects elsewhere (e.g. at an early-stage startup), and then being acqui-hired or hired back one or two levels above the one the engineer quit at.

Expand full comment

> This is the same question I ask about George Mason. Many people have remarked on

> how impressive it is that they have Tyler Cowen, Bryan Caplan, Robin Hanson, Garett

> Jones, etc, despite not being the sort of Ivy League school where you would expect

> famous people to congregate.

Most of these people -- if not all -- became influential after joining GMU. Robin Hanson was nearly a complete unknown when he took that job, as I recall. That is, he was unknown except in certain backwater internet communities where participants knew he was already a formidable intellect. I used to follow his science and tech conversations with Eliezer Yudkowsky back in the mid 90s and made a mental note to keep an eye on both of them.

Expand full comment

The R1 classification is not accurately described as "(e.g., Ivy+, Stanford, MIT, ...)"; the list is mostly public state universities: https://en.wikipedia.org/wiki/List_of_research_universities_in_the_United_States

Expand full comment