
> Completely exclude research from the main career path and consider imposing outright guild-like rules preventing your hires from even publishing research while they're employed with you, and perhaps for six months afterwards.

Psychology research student here (or, as the lingo goes, "early career researcher", aka "person we can get to do grunt work on the cheap").

Speaking entirely personally, I really wouldn't find this direction appealing. I am working on my PhD because I want to do research. Oh, I enjoy teaching, and would be totally fine with teaching being ~ 50% of what I spend my time doing, but a teaching-only position would be extremely unsatisfying to me, since research is what I'm primarily interested in - expanding human knowledge, learning new things, finding different ways to frame ideas and test theories etc. So being literally banned from doing it would cut the main motivation out, no matter how much I enjoy teaching.

I'm not even especially fussed on "prestigious institution". Oh, I see the benefits - more resources, access to "better" colleagues, etc - but I would rather do research at a less prestigious place than teach at a more prestigious place. Obviously I'm just an N of one, but speaking anecdotally this fits with the people I've worked with. If it helps I was raised solidly middle-class, and faced few serious barriers to an academic career (beyond structural ones that everyone has to deal with, like competition etc).

Based on the way this ended, you should also link Bret's own earlier post, "So you want to go to Grad School in the academic Humanities?" https://acoup.blog/2021/10/01/collections-so-you-want-to-go-to-grad-school-in-the-academic-humanities/

It is very similar to the advice given in section 6.

Academia selects for tolerance of suffering. Seemingly infinite capacity to tolerate suffering.

I got my Physics PhD later in life and to get a postdoc would have meant uprooting the family, moving to a random place on this planet, then giving up on work/life balance for 2-4 years, then most likely getting washed up due to extremely high competition for a permanent position everywhere in the field. The situation is about 10x worse for theorists like me.

Instead I went back to industry to a comfortable living and a 9-5 job as a senior SWE. Not a happy ending, but at least I can tell a good paper from bad after reading the abstract, for fun. And tell if a sensational headline in my area in a pop sci journal/site is complete BS (spoiler: 99% of the time). It is somewhat satisfying.

> But why? Name recognition among other classical historians isn’t that useful; probably every academic classical historian knows every other academic classical historian, it can’t be a big job market.

“Name recognition” isn’t exactly what they’re going for. Rather, every academic wants to hire the people whose work they are constantly using and thinking about, and the academics are in charge of hiring.

Prestige among one’s peers is also a nice thing, so they’re willing to hire the person whose work is useful to many others in the field, even if they personally don’t use it.

Name recognition is only a poor proxy for this kind of prestige, because as you say, most people in a field know most other even moderately prominent people in the same narrowly-enough defined field.

> This is the same question I ask about George Mason. Many people have remarked on how impressive it is that they have Tyler Cowen, Bryan Caplan, Robin Hanson, Garett Jones, etc, despite not being the sort of Ivy League school where you would expect famous people to congregate. The answer has to be that the department is selecting for Devereaux-like people with popular fame rather than academic fame. What tradeoffs are they making here, and have they paid off?

George Mason is an unusual case. I think it might have actually started with their law school going specifically for “law and economics” as their growth subfield. But their economics department very specifically selected for libertarianish heterodox people, who seem to speak to people outside of economics even if they don’t speak to economists. This is often a way that a lower prestige university can jump some rankings in a field, by taking a chance on becoming the specialist in some subfield that isn’t highly respected within the field, but might be in adjacent fields, or outside academia.

> This is the same question I ask about George Mason. Many people have remarked on how impressive it is that they have Tyler Cowen, Bryan Caplan, Robin Hanson, Garett Jones, etc, despite not being the sort of Ivy League school where you would expect famous people to congregate.

I've mentioned this a few years ago, but I think it bears mentioning again. The standard rankings that people think of mostly apply to undergraduate education. For graduate departments, every field has its own set of rankings, and different schools will sometimes have distinct specialties or areas of focus. The top undergraduate universities won't have truly *bad* departments in anything, because they need decent teachers and therefore they need decent grad students and therefore they need some decent professors doing research. But those departments might be mediocre backwaters from the perspective of the field. Meanwhile, other universities that aren't competing for the top undergraduate spots may have carved themselves out a niche in a few fields.

For instance, there might be a top undergraduate university that also happens to have a top law school, but, say, their linguistics department is a tiny thing that has a handful of "famous-in-the-field" professors but isn't really on the radar of the field as a whole. Or take the very name of the "Chicago school of economics"; I don't know what the University of Chicago's department of Economics is up to these days, but for a while it was an iconoclastic powerhouse.

The Economics department of George Mason might be going heavily for popularizers, but it also sounds like they have an affinity for approaches that resonate well with the rationalist crowd (if that isn't putting the cart before the horse, given that we're talking about Robin Hanson). And they may also have a political slant that allows them to "moneyball" talented professors who are being discounted by departments at other universities with different political slants.

I don’t think the example with junior and senior devs is universal. Frankly, it feels very counterintuitive to me.

I have been running software engineering businesses (as CEO) for 12 years now. My first company had 200 engineers; the current one has 40.

Our best people were always inside talent, juniors we hired just after school (or even during) who learned and grew with the company.

A 100% salary raise over the course of a single year is not common, but we have had some such cases. However, growing 25+% per year is quite common (roughly 100% over 3 years).
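As a quick sanity check on the compounding arithmetic above (a toy sketch using only the numbers from this comment):

```python
def compound(salary, annual_raise, years):
    """Apply the same percentage raise each year and return the result."""
    for _ in range(years):
        salary *= 1 + annual_raise
    return salary

# 25% per year for 3 years: 100 -> 195.31, i.e. just under a 100% total raise,
# which matches the "roughly 100% over 3 years" rule of thumb.
print(round(compound(100, 0.25, 3), 2))
```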

I am very confident that for super talented individuals it's much easier to reach a high position if they stay with the company than if they change jobs. A new employer will see "not enough years in the CV" and thus have a negative bias, whereas we have insider info about their performance.

We have always had transparent salaries, though, so everyone sees everyone else's salary.

>What tradeoffs are they making here, and have they paid off?

The tradeoff is that an otherwise boring run-of-the-mill woke institution like GMU has a persistent nexus of contrarian controversy. Here's an insider perspective: https://www.themotte.org/post/341/culture-war-roundup-for-the-week/60216

In many ways this sounds really hopeful, doesn't it?

"Colleges really are looking for 'superstars'..."

So there actually is a massive institution dedicated to finding the smartest people? As with any institution, it generates a bunch of nonsense and weird boundary effects, but the point is that there really is a big network of people who are not trying to do anything other than identify great intellectual talent and put it to work? That's exactly what we want.

Name recognition among ordinary people is not well-correlated with "prestige" in academia, which (for a long time, though this may be changing somewhat) follows what your letter-writers have to say about you when you come up for hiring, tenure, and promotion after tenure. And what letter-writers have to say about you depends on your reputation among the 20 or so peer researchers in your specific specialized subfield, who are those very letter-writers. And they're asked to say how good your latest research is, not what you have done for ordinary people lately. So the person who wrote the buzziest technical papers in the hottest area of classical history (or analytic philosophy, my field) will have a lot more prestige in the hierarchy game than the most famous public intellectual who sells a million trade books. Things are changing somewhat, but only in that some public intellectuals are getting rewarded more than they used to be; the prestige economy of trading letters about the buzziest specialist research is still the dominant paradigm (maybe it should be?).

Re: public intellectuals. There are two kinds. (1) Someone who does major cutting-edge technical work first, then writes some books for the public (Murray Gell-Mann, Paul Krugman, Noam Chomsky, Richard Dawkins, Mary Beard), and (2) someone who just makes a name writing for the public (Michael Shermer, Sam Harris). Only those in group (1) get to work at Princeton or MIT. At the snootiest of places, public writing is seen as more of a sideline or hobby. (BTW, I love the fact that the academic image for this post is Corpus Christi, Cambridge.)

Tangent about programming in big tech companies!

It's hard to overstate how difficult it is to tell if someone's work is good from the outside. If John's taking forever on a project this could be because he's really bad at coding, or the project could just be unexpectedly complicated. Or, commonly, it could be both! Often the project is *in the present* genuinely really difficult because John *in the past* built the system in a way that made it complicated to extend or modify.

(As an aside, I think this is very common– programming is hard for most people!– and this sort of accidental long-term self-sabotage is the entire reason the 10x programmer meme is real; it's comparing a Honda Civic with a Honda Civic that goes 1/10th the speed because someone took a sledgehammer to it.)

Anyway, these situations look identical to management, so the only people who can really judge your skills are your peers on the team, who know enough about your work and its context to tell if someone is productive or terrible. Of course, calling out another programmer on your team as sucking is a dick move. So real information about who is or isn't good doesn't propagate through the system efficiently and mostly gets inferred over time by management as someone accumulates a track record of successes or failures.

Software engineering at tech megacorporations is just Molochian principal-agent problems all the way down, which as Paul Graham observed is why startups can succeed at all– if you can build an environment where genuinely superstar programmers can work to the best of their abilities, where the systems all make sense and cleave reality at its joints, then I 100% totally believe you could run circles around a tech giant even with 1/100th of their money.

>"I was confused by this - why are they giving some people a big raise after a few years, but not others?"

An alternate way to think about this that I find useful: managers basically pretend that people very rarely leave, that little can be done in the rare case someone decides to leave, and that no one can predict who leaves when. (This is very self-serving, as it absolves management of responsibility for retention; otherwise there might be political implications in the status game.)

This means that you hire essentially under duress to fill staffing holes, and look for the best deal while doing so (i.e., hire lower-cost replacements, i.e., give that lower-cost replacement a promotion), while long-term retention gets little investment. It just happens to save the company a lot of headline money ("we hire people cheaper than our average salary!") while allowing the company to ignore the cost of churn ("well, we can't control when someone leaves, that's just the cost of doing business").

This strikes me as generalizing well to academia if you consider "value" to be not just salary but some sort of nebulous "future potential."

Jun 6, 2023·edited Jun 6, 2023

"But if they’re going for name recognition among ordinary people, why not privilege the historians who have name recognition among ordinary people?"

From my view as a consumer of pop science/pop history, there seem to be two reasons:

(1) Snobbery. So what if the unwashed masses know the name of Gerry Jones off the telly? Does he have the intersectional theoretical post-post-modern cred that will make the other institutes gnash their teeth and go "Why didn't we get him????"

(2) Quality. Often the guys (and girls) who make it into public consciousness *do* do so via "that guy off the telly" recognition. Dumbing down your subject to make entertainment does lead to a drop in quality. I've been terribly snobby myself about David Starkey, even though he is an established historian, due to all the BBC series he's done on the Tudors, and due to his 2008 book about Henry VIII, "Henry: Virtuous Prince", where the chapters all read like television scripts (and may have been intended to lead in to the 2009 Channel 4 show "Henry VIII: The Mind of a Tyrant").

Do Neil deGrasse Tyson's forays onto Twitter do anything other than beclown him? Going popular/populist often seems to lead to a decline in quality.

Bret Devereaux may be known and famous to *us*, but that doesn't mean the wider public in general know him, so he's not "Big Name off the telly" either.

EDIT: You do get the rare superstars who *are* big names in their field *and* get public name recognition, like Stephen Hawking (or Einstein, the literal poster boy for this). But in general, the ones who go on to be "Mr/Dr/Prof Telly Expert" aren't the ones at the cutting edge of the field.

>I was confused by this - why are they giving some people(programmers) a big raise after a few years, but not others?

Because there are programmers who spend 100 hours making a button move 1 pixel to the left on a webpage while making it 10x slower, and others who fundamentally understand how to make a 3D engine run on 90s hardware.

Computers do all the heavy lifting, and if they are going to listen to you, they start instantly. It's knowledge work with the possibility that, if you're good enough, you can do things without communication/paperwork overhead.

The surprising thing is that it's only double.

While we're on the topic (if we are) of popular versus academic work, here's a comedian flogging his history book:


> One of the things I have long espoused, in the sense of thinking it but literally never telling anyone, is that the United States Congress should establish a system of National Teaching Universities that do exclusively undergrad and non-degree training.

Isn't this approximately the role that small liberal arts colleges fill? By definition, SLACs have a high professor-to-student ratio, and professors do little or no research. The main selling point is that undergrads will have direct contact (and potentially mentorship and project opportunities) with impressive professors. Many of these are high-status and charge high tuition, so they're attractive for professors in many fields (though perhaps not engineering or lab sciences).

High tuition and small class sizes exclude most applicants, of course, so this model doesn't make high-quality teaching widely available. But it *is* a system where professors' main responsibility is to teach.


I'll only echo the advice I heard one of the organizers give to someone else at a hard-science career fair during a major conference in that field: if you're a man, your spouse is a woman, and you're both academics (PhD or postdoc), she should try for academia (if she wants; she could also go for industry if she'd rather). You should focus on industry. You do not stand a chance at a tenure-track position unless you're very exceptional.

How this generalises to non-spousal situations is left as an exercise to the reader.

A lot of this comes down to taking a risk on a new hire. You get someone right out of college or some kind of program, and really have no idea how good or bad they are. They don't have or haven't been able to demonstrate certain skills at all, which includes direct work performance but also sometimes basic things like "shown they can come to work on a consistent basis" or "doesn't flip out at coworkers." The person is dealing with the same phenomenon at every potential employer, so there's not much pressure to pay them a lot coming in. You can't usually lower a pay rate after you hire someone for various legal and psychological reasons. So your new hire comes in at what is honestly too low pay for the type of work they will be doing.

Then, one of two things happens. They either do well or they do poorly, where this can be both hard skills directly related to their job tasks and also soft skills like getting along with the team and communicating well. If they do poorly, you don't want to give them a raise, which means you are reluctant to set up any kind of automatic raises for years in the job. You want to give raises to people based on merit and what they've done for you (and there's also an incentive to not give raises to people who aren't asking for them, since that means less money being spent).

New hires with existing experience don't have this problem because you can review their actual work record and test their actual skills. Someone with five years in the field may have completed difficult job tasks or projects, gotten promotions, or can describe in intricate detail how they do their job. This means an employer can justify a certain pay rate even on initial hire, and know they aren't getting fleeced.

No one ever got in trouble for hiring a great employee with proven skills, even paying them what they're worth. Lots of hiring managers have gotten in trouble for hiring a dud that the company has to now deal with - either through legally risky and personally difficult discipline and firing, or through having an underperformer just stick around and drag the team down.

Typo, section 1, 3rd bullet: "comment" --> "commitment"

The comments about looking at the *potential* for new recruits reminded me of something someone said about the valuation of startups.

The idea was that as long as the company is not making a profit, it's valued based on its potential. As soon as it posts a profit, it's valued based on its return on investment. This causes its market cap to take a huge hit, because the potential is always much larger.
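A toy illustration of that switch in valuation basis (all numbers, names, and multiples here are invented for the sketch, not taken from the comment):

```python
# Pre-profit: the market prices the dream (a hoped-for share of a big market).
def valuation_on_potential(market_size, hoped_for_share):
    return market_size * hoped_for_share

# Post-profit: the market prices actual earnings times some multiple.
def valuation_on_earnings(annual_profit, multiple):
    return annual_profit * multiple

dream = valuation_on_potential(10_000_000_000, 0.10)  # "we'll capture 10% of a $10B market"
reality = valuation_on_earnings(20_000_000, 25)       # $20M profit at a 25x multiple

# dream = $1.0B, reality = $0.5B: posting a profit can cut the story in half.
print(dream, reality)
```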

So if you don't have a track record, people look at your potential.

As soon as you do stuff, they can look at what you've actually done. Which is always much less golden than their fevered imaginings.

Related to 1).

Another interesting phenomenon is that while academic institutions seem to want to cultivate superstars, they sort of actively discourage this in the actual educational track. In undergrad and going into grad school, there seems to be a much higher value placed on conformity and parroting the right viewpoints than on actual intellectual superstardom.

They would much rather have an "above average" player who does a meticulous job citing things and regurgitates the professor's views than someone with higher horsepower and actual insights who is a little off-kilter.

At least that was my repeated experience. “You were the best student by far in 48 of your 50 classes, but just blew off 2 for no reason due to irresponsibility” is seen as a big downside compared to the second best dude who gets an A in 50 courses but isn’t exceptional.

Which makes sense if you are looking to pump out cogs with high reliability, but doesn’t make a lot of sense if you are looking for superstars/intellectual advancement.

I will second Simon’s perspective. Working on personal projects or school work teaches you how to code, but there is a large information gap between students and people actually working in industry. You see your first large codebase at a job. You see the types of tools and conventions that make working on a massive codebase possible at that job. You go through the horror or trying to set up your dev environment at your first job etc.

Juniors do legitimately get at least twice as useful, if not more, after their first year or two, so their pay naturally doubles. After that it levels off. The odd part is that companies are willing to let these quasi-seniors go to another company. The answer, as provided by commenters, is that you only get that 100% pay increase once in your career, once you learn what the industry is actually like, but workers don't understand that. So the company is forced to let a quasi-senior who is already integrated into the team go, in favor of a new quasi-senior who has experience at a different company, but at least has some experience.

Reading about the comments on programming salaries, I wonder if the readers are speaking from a non-US, non "mainstream" tech company POV (where "mainstream" includes big tech (eg: Amazon), medium-to-large start-ups (eg: Pinterest), and dated but still present companies (eg: Oracle)).

Context: I went to a pretty good CS school, so most of my college friends and I either work at mainstream companies or founded our own startups. Plus, most of us interned at 4-6 different places prior to graduating, so we've shared information with each other about a LOT of companies!


For "mainstream" tech companies in the US, what I've read couldn't be further from my impression of reality. Salaries at all of these companies are:

1. Much higher than described (even Oracle will give ~120k to a junior engineer!)

2. Much more transparent than described (salary sharing is much more common than other sectors I've worked in - levels.fyi has a TON of data)

3. Much more consistent within levels than described (bonuses reward strong contributors, but cash salary typically falls within a fixed range)

Also, promotions at these companies _are_ often commensurate with your skills, although as you get to higher levels, the limiting skills become more and more non-technical...because most tech companies don't need every engineer solving super complex technical issues! They need them solving *business problems* to create value for the company, which may involve challenges in areas like disambiguation, coordination, communication, or experimentation. Depending on whether you think these are part of your job will determine whether you think promotions are correlated with "ability" or not.

Yes, it is typically _faster_ to grow levels by jumping between companies, and it can result in higher compensation due to the opportunity to negotiate, especially over equity and signing bonuses. But that's more about having a negotiation opportunity and knowing how to use it than anything deeper.

That said, many friends from college and I have been promoted internally just fine, with increases in compensation commensurate with our new levels.

As someone mentioned, in STEM the start-up package is $100k-$1M. If you don't get that back in grant funding before you are up for tenure - good luck. They would rather try again with a new Assistant Professor.

At the top schools, they get external reviews of the tenure candidates. When I was at one such, an Inorganic Chemist came up for tenure. They asked for rankings of him vs a list of other candidates in same field, similar institutions. He wasn’t 1 or 2 on the lists, so no tenure. They made an offer to the guy who was #1 on the most lists, he left another top school and is still there today.

Yeah, I feel like the "supply and demand" argument - specifically the "too much supply, not enough demand" aspect - can carry the lion's share of this.

A lot of the original article seemed to have "STEM is different" caveats and I think a good explanation of why STEM academia would be different is just that STEM academia has *far* more competition with the professional world. Being a STEM academic usually means passing up a fairly lucrative career as a professional.

But if you study Medieval French Literature... well there just aren't that many places hiring for Medieval French Literature specialists other than academia, so you end up with a glut of people with degrees competing for a few positions.

I feel like a lot of the issue is both on the modern "everyone should try to go to 4-year college at least" combined with a general lack of emphasis on a *career* plan. Fairly subjective, but in my high school and college years, I remember there being fairly little focus on picking a major with an eye towards future employment.

(Even for myself: I ended up doing a Math/Comp Sci double major and obviously half of that turned out to be very employable: but even I basically just majored in what I enjoyed doing, I just was lucky enough to enjoy a well-paying field)


I guess you could test this by looking at non-STEM fields that also have strong professional career opportunities, e.g. Law. IIRC, neither Scott's post nor Bret's specifically mentioned it, but does the academic law scene look more like STEM, or does it look more like the rest of academia? If it doesn't look much like STEM, I guess there'd have to be a more STEM-specific explanation.


> Several people disagree, and say these institutions exist (community colleges, technical colleges, etc), but upper-class people don’t go there because they’re not prestigious enough. What would it take to have something like this which was also prestigious? Or is that undesirable, because it would rob us of the research positive externality.

I would say the prestigious version of this institution *does* already exist, in the form of elite undergrad-only liberal arts colleges such as Harvey Mudd, Williams, or Amherst. The students tend to be very good, and with a lower teaching load and smaller classes than most places (funded by sky-high tuition), which attracts generally very good teachers. If I were going for a teaching-only career, I would definitely want to be at one of those places, even more so than a research-elite (a.k.a., "R1") university such as Berkeley or Stanford. The students are also excellent at top R1 places, but the classes (at least in computer science) can be huge. Perhaps the teaching load is higher for teaching-track faculty at R1 places than it is for professors at elite liberal arts colleges (not sure though, and it might vary by place).

But even at an undergrad-only elite liberal arts college, it's not *pure* teaching like at a community college. I believe some modest amount of research is expected and/or encouraged (all the professors there generally have PhDs for this reason), primarily because many of the students want to get into elite graduate schools, and (in STEM) doing research as an undergraduate is the best way to get into an elite graduate school. Still, I'm sort of guessing here, since I don't myself work at a liberal arts college.

No mention of pro sports, huh? Nearly all of the North American pro leagues at this point have had to include rookie scales in their collective bargaining agreements with the respective unions because of newly drafted players seen as having enormous potential getting paid more than established veterans. This wasn't even just pushed by the players, either. The owners were in part trying to save themselves from themselves because they know they're overpaying based on the high risk but do it anyway.

In part, I suspect this has something to do with egos and the nature of "success" in some of these fields from the owner's perspective. Look at high-profile flameouts like Dan Snyder recently, Donald Sterling with the Clippers, or Frank McCourt with the Dodgers: they drove teams into the ground, failed over and over, put out a horrible product, got forced out of the league by other owners (or their own wives, in McCourt's case), then sold for enormous multiples of what they purchased for and became or remained multibillionaires anyway. Being in charge of a university or owning Google is probably a lot like this. What does "risk" mean for you? Jeff Bezos probably lost the most net worth anyone has ever lost by getting divorced, yet he's still the world's third-richest man. It impacted his ego only. That can't have had any material impact on his quality of life or ability to buy whatever the hell he wants to buy from now until eternity.

Harvard doesn't have owners, of course, but realistically what would it take for Harvard to ever not be considered a top ten school in the US and not have a 12-figure endowment or whatever it's up to? They're competing solely for bragging rights and bragging rights are all they can lose. To me, that explains a whole lot of apparently irrational market behavior because there aren't real markets, at least not competitive markets, if nobody involved can possibly suffer actual ruin and the marginal utility of a dollar is so close to zero that you have to have non-monetary goals like US News rankings or winning a Super Bowl or being the first to Mars.

Various thoughts:

> Bret Devereaux is ... probably one of a tiny handful of academic historians I (and many other people) have heard of.

I doubt it matters. Academia is very inward-looking. As the joke goes, a physicist in Boston knows more about the physicists in Tokyo than he does about the chemists in the next door building at his university. The critical question is how well you are known by academic historians, especially the highest-status ones. Also, Devereaux complains that it's easier to become famous as a blogger than to get on the tenure track. I'm sure that's true, but OTOH being a famous blogger is still gig work rather than the iron-clad job security at high pay of a professorship at an R1.

There are lots of complaints that the end of mandatory retirement is clogging up the job pipeline. I've seen references that this is not so; professors generally do retire in their mid-60s.

This is different in STEM fields. For one thing, the students are worked hard in labs but generally don't have a heavy teaching load, even in math, which has no labs and a high percentage of "service courses". But in STEM, it's easy to jump into non-academic employment, so there's an upper limit to the degree people can be exploited. (If you're in math, lots of technical employers will be superstitiously impressed, which can compensate for a lack of demonstrable job skills.)

Also, I have seen recent reports in Science magazine of a shortage of postdocs, to the point that granting agencies won't allow you to put a high enough postdoc salary into a grant that you could actually hire a postdoc. But again, I'm sure that's because such people can bleed off into industry more easily.

I do wonder why so many people go into Ph.D. programs in non-STEM fields. It's well-known how gruesome the job market is.


> I’m confused by the “superstar” claim. Bret Devereaux ...

You're conflating two different senses of "superstar": 1) known to the general public via mainstream publishing, blogging, and podcasting, and 2) high reputation within academia.

The counterintuitive bit is that 1) is often assessed *negatively* in regard to 2), and it's 2) that dominates in hiring decisions.


The commentary around software/IT doesn't match my experience at all. Everywhere I've worked has a fairly linear promotion/salary scale. And I've seen a significant number of people climb the ladder as well. I wonder what the disconnect is...


I just finished a PhD in STEM at a tier 1 name-brand university in exactly 5 years. It was an extremely challenging process. My advice to PhD students would be to read a good book on negotiation. At the PhD level, when you graduate is completely subjective and determined by what you negotiate with your advisor, not by your ability to complete mandatory requirements as at lower education levels.

Even though many of my peers have performed better research, the thing that worked for me was that I got a job offer in an adjacent research-focused industry job, and that helped pressure my advisor to let me graduate; knowing that I had other options also made my advisor more OK with me moving on. Of course this depends on your career goals. I didn't totally enjoy the PhD process, so I didn't want to put in more effort than I had to (I mean, I love research, but the low pay, the hours, and the power dynamic with my advisor were not enjoyable). I have other friends who were on the cusp of discovering something big, so they stayed 1.5 more years to work on the same topic, which ended up boosting their personal prestige and hireability. But if you are at year 3 or 4 and don't want to go to year 7 or 8, it is good to start negotiating, and strengthening your negotiating position, for graduating.


Regarding job hopping in software dev (and other fields):

I think that many orgs end up hiring based on expected results, but give raises based more on effort and seniority.

Basically, when you're hiring, you don't know the folks and you're comparing everyone against each other fairly ruthlessly. You are in a good position to decide one new hire really is worth twice a different hire.

Performance reviews tend to be an awkward chore at most places. It's also just really tough to look at two people you know and like, see them both working hard and doing their best, and then decide one of them is worth twice the other. Human instincts around fairness pull hard on you - it isn't fair for two people to put in the same effort and one person gets twice the reward. So you end up with a system where everyone gets some modest raise every year with a bit of an adjustment for performance, rather than actually assessing everyone's performance every year and moving salaries up and down to match market values.

There can also be social problems if two people are hired around the same time but one of them takes off while the other doesn't. That can definitely create some resentment and feel like favoritism or bias. It's psychologically easier if everyone does the same little shuffle - I poach your guy and you poach my guy and nobody left behind has to sit there stewing about how they're being mistreated.


Paul Graham talked about a similar thing discussing how YC or other good venture capitalists look for startups to fund - I can't remember which of his posts it was exactly though, but http://www.paulgraham.com/growth.html comes close.

The basic idea is: screw expected value (that's what traditional investors are for); what's the chance of them going unicorn? From that perspective, something that has a 5% chance of becoming the next big thing and a 95% chance of failing is a better startup than something that has an 80% chance of becoming a financially stable, so-so medium-sized company but only a 1% chance of going big.
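A toy calculation makes this concrete (all payoffs and probabilities below are invented for illustration): the modest startup can even win on expected value while still losing under a what's-the-chance-of-a-unicorn metric.

```python
# Hypothetical outcomes, in $M, for two startups. A is the long shot;
# B is the likely modest success. Which looks better depends on the metric.
UNICORN_PAYOFF = 10_000   # a $10B outcome
MODEST_PAYOFF = 700       # a financially stable, so-so medium company

p_a = {"unicorn": 0.05, "fail": 0.95}
p_b = {"unicorn": 0.01, "modest": 0.80, "fail": 0.19}

ev_a = p_a["unicorn"] * UNICORN_PAYOFF                                  # ~500
ev_b = p_b["unicorn"] * UNICORN_PAYOFF + p_b["modest"] * MODEST_PAYOFF  # ~660

print(f"EV: A = ${ev_a:.0f}M, B = ${ev_b:.0f}M")    # a plain investor prefers B
print(f"P(unicorn): A = {p_a['unicorn']:.0%}, B = {p_b['unicorn']:.0%}")  # YC prefers A
```

With these made-up numbers, B has the higher expected value, but a fund whose returns are dominated by its single best outcome still ranks A first.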

I presume it's the same story for TT hiring: the first thing you care about is the chance of them getting into Nobel Prize territory (though Fields Medals probably also cut it). I think the commenters who explained how proving you're good-but-not-great essentially locks you out of TT at an R1 have hit the nail on the head here.


> Superstar effect is a big explainer. People want Steph Curry, and are willing to fight over potential candidates that have that potential. Experience does not turn [insert middling player] into Steph Curry.

Maybe not Curry, but this is pretty much what happened with Jokic, the best player in the league and already one of the best centers of all time at 28, who was infamously drafted 41st overall during a Taco Bell commercial. There was a post on /r/nba that looked at draft pick order and found that it had basically no predictive value after the first few slots. It is the case that most superstars go early, with some exceptions (the aforementioned Jokic, Kawhi Leonard, Kobe Bryant). But a lot more not-quite-once-per-generation stars end up being drafted much more towards the middle of the pack.


Sounds like what Bryan Cranston said about acting. Basically - don't get into acting unless you can't bear doing *anything* else. If another job could make you happy, do that one.


> Also, many many many people, both tenured professors and people who had dropped out of the academic track, commented to say that going into academia was a bad idea and that people who were considering it should think very carefully and probably change their mind.

It's very annoying that this is a meme I've more-or-less known about since forever and that it still didn't quite click until I was doing a PhD myself. Possibly part of it is that I know plenty of IRL people who told me otherwise, that doing a PhD was a good idea, that even if most people fail/hate it I wouldn't, etc. The vast majority of the "don't go into academia" advice I got was online and not targeted at me specifically. It was hard to really take it into account in the face of everything else.

Anyway, I now make sure to talk about this personally and face-to-face with everyone I know who is considering academia. But I don't think I've gotten through to anyone yet. Kind of hard for people to take you seriously when you're going "do as I say not as I do" on them.


I thought George Mason had all of those people because of the Mercatus Center, which has a clear ideological focus, which moved there because of a Koch donation back in 1980. I think the 'popularizer' impression is downstream of it being the place to be if you're a libertarian of that particular stripe.


Your FAANG compensation numbers are wrong (way too low), but they also don't model the dynamics well. There _are_ explicit job levels (ranks and titles) with formal promotion processes, which tightly band your comp. You _can_ absolutely get promoted inside the same FAANG you start at; it's a very metagamed thing with some toxic dynamics, but you absolutely don't need to leave to do it.


> But why? Name recognition among other classical historians isn’t that useful; probably every academic classical historian knows every other academic classical historian, it can’t be a big job market. But if they’re going for name recognition among ordinary people, why not privilege the historians who have name recognition among ordinary people?

> This is the same question I ask about George Mason. Many people have remarked on how impressive it is that they have Tyler Cowen, Bryan Caplan, Robin Hanson, Garett Jones, etc, despite not being the sort of Ivy League school where you would expect famous people to congregate. The answer has to be that the department is selecting for Devereaux-like people with popular fame rather than academic fame. What tradeoffs are they making here, and have they paid off?

For the first paragraph, if you believe what Bret Devereaux says (on another topic), it's probably just a mistake on the part of the colleges.

Devereaux frequently writes about the difference between, in his terms, engagement and advocacy. Advocacy is when you attempt to harangue people into doing something that you want them to do, like dividing their trash into several different buckets. Engagement is when you do something that other people want you to do, like writing about a topic they're interested in.

Devereaux constantly comments that the rest of "humanities academia" sees advocacy as their duty. In their view, that is the purpose of being a professor. Devereaux contrasts his own view that advocacy is a luxury you indulge in for your own benefit, and engagement is what gives you the means to do so.

I think it's a small leap to the conclusion that universities have lost track of what gives them influence out in the rest of the world, so they aren't looking to make hires that would help with getting more of it.

For the second question, what I've always read about George Mason is that their strategy is ensuring that it's ok to not be left-wing on campus. It's not so much (according to what I've read, which hasn't focused on this question) that they're looking for popular appeal, it's just that anyone who's not very far left of center would naturally gravitate to them.

Jun 7, 2023·edited Jun 7, 2023

"I have one former student who claims he got his job because he told his boss he understood stats well enough to tell good stuff from bad. He said when he saw 'bad' stats or research design or whatever, he'd just tell his boss, "If I turned that in to Dr X (me), the nicest thing he would do is give me an F, throw the work at me, and tell me to do it over." He became the go-to guy for stats in his job. He doesn't DO stats, he just knows them well enough to tell good from bad."

Quality Assurance can be a great path if you're good at thinking and intellectually well-rounded from studies/work but lack the specific technical skills. I can't program worth a damn, but my philosophy and logic genuinely really come in handy as a software tester.

It's been said that programming requires finding one way that works, while testing requires finding all the ways it could fail. Those are both different skillsets and mindsets.
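A toy illustration of the two mindsets (the parser and inputs here are invented): the single happy-path check a programmer reaches for passes, while a tester's adversarial inputs don't crash at all; they silently return plausible-looking garbage, which is exactly the kind of bug QA exists to catch.

```python
def parse_price(text: str) -> float:
    """Parse a user-entered price like '$1,234.56' into a float."""
    return float(text.strip().lstrip("$").replace(",", ""))

# The "one way that works" the programmer verifies:
assert parse_price("$1,234.56") == 1234.56

# The ways a tester tries to break it. None of these raise an error;
# each one quietly returns a wrong or nonsensical value instead.
for weird in ["1,2,3,4", "nan", "inf", "$-5"]:
    print(repr(weird), "->", parse_price(weird))
```

Misplaced commas parse as 1234.0, "nan" and "inf" are accepted as numbers, and a negative price sails through, all without a single exception.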


"Also, many many many people, both tenured professors and people who had dropped out of the academic track, commented to say that going into academia was a bad idea and that people who were considering it should think very carefully and probably change their mind."

Back to Bret Devereaux:

"So Should You Do it?

This is a tough question for academics to answer. Our profession runs on an apprenticeship system and so on a fundamental level we want younger scholars following in our footsteps. We’re excited to learn that our students are thinking of going forward in the field because we are excited in the field. Some of my past undergraduate students are doing their graduate studies now and it makes me smile with pride thinking about the great things they will learn, do and write. So it pains me to say that, in most cases, the answer is pretty clearly:

No, you should not."


Also, again, this probably changes for STEM. Since people with STEM skills can usually find better paid outside work easily enough, institutions have to be far, far nicer to them.

Jun 7, 2023·edited Jun 7, 2023

I am a junior programmer, and the section on the programmer job market sounds completely foreign to me? Back when I was in college applying for new grad jobs, almost everyone I knew (in college and in Discord groups for CS majors) who ended up at an industry programming job at all was getting offers for at least $150k, though some people who really struggled ended up at jobs paying as low as $110k. A substantial proportion (40%+) of those new grad offers were from non-FAANG. My own lowest offer was $175k from a no-name public company. I don't recall jobs that pay <80k being ever discussed or mentioned.

(This was back before the market for software engineers crashed late last year).

Here are my hypotheses on how to explain the discrepancy:

1. Maybe the commenters who mentioned those $60-$80k junior programming jobs are not in the US.

2. Maybe job listings with "junior" on the title pay way less than those with "new grad" on the title. I and the people I know applied predominantly for "new grad" jobs.

3. Maybe there are people whose literal job title is "programmer" or "computer programmer," and they're paid way less than people whose title is "software engineer." I was under the impression that companies largely stopped using the "programmer" title because it makes the USCIS reject H-1B applications, but maybe there are a lot of companies out there who never hire H-1B and so still use that title, and they pay way less?

4. Maybe the commenters were talking about the status of job market 15 years ago.

If (1) or (4) isn't the answer, I'm curious about the demographics of junior programmers who make 60-80k in the US. Do they typically have CS degrees? How do they hear about and get those jobs? Do they know about the existence of 150k+ entry-level jobs?


My theory of programmer salaries is that some programmers are chumps. Suppose a newbie is worth $75k and someone with a few years' experience is worth $150k. You hire a newbie for $75k and three years pass. If he's smart he starts thinking, "Man, I could go to some other company and get $150k," but if he's dumb he never thinks that; he just keeps working for you, providing experienced work at a newbie salary. If enough people are chumps, then the best strategy is to let your non-chump employees switch to a new company in order to get away with underpaying the chumps.
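The equilibrium described above can be sketched as a toy payoff model (the salaries and the chump count are invented; this only illustrates the commenter's logic, not real market data):

```python
# Toy model of the "chump" equilibrium: out of 10 experienced employees,
# some never ask for a market-rate raise ("chumps"); the rest leave unless
# they are paid market rate, so they earn market rate either way.
NEWBIE_SALARY = 75_000
MARKET_SENIOR_SALARY = 150_000
SENIOR_VALUE = 150_000  # what an experienced dev produces for the employer

def surplus_per_10_employees(n_chumps: int, pay_chumps_market: bool) -> int:
    """Employer surplus from 10 experienced devs under each raise policy."""
    chump_pay = MARKET_SENIOR_SALARY if pay_chumps_market else NEWBIE_SALARY
    surplus_chump = SENIOR_VALUE - chump_pay
    surplus_nonchump = SENIOR_VALUE - MARKET_SENIOR_SALARY  # zero either way
    return n_chumps * surplus_chump + (10 - n_chumps) * surplus_nonchump

# With 4 chumps in 10, underpaying nets $300k/yr; uniform market raises net $0.
print(surplus_per_10_employees(4, pay_chumps_market=False))  # 300000
print(surplus_per_10_employees(4, pay_chumps_market=True))   # 0
```

Under these assumptions the employer captures surplus only from the chumps, so letting the non-chumps walk costs nothing, and raising everyone to market rate costs everything.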


Some more points re. the Programmer market, which maybe generalize to the academic job market:

> Why? People mostly speculated that the junior/senior distinction is reality-based; it takes 2-4 years to learn to do the most important tasks. The failure to hire insiders might be because workers aren’t on board with this distinction and would expect a more linear raise schedule.

It’s actually a lot more rational than that. First, there’s a risk/reward tradeoff in promoting people internally. Due to inertia and the general difficulty of letting people go, there’s a risk associated with promoting an insider: they might lack the required skills, or might not be politically viable as a promotion candidate. On the skill side, if someone is promoted too aggressively, there’s a higher perceived risk that they gamed the promotion-evaluation system. Someone recently promoted to a senior position is a lot harder to let go of immediately (and it’s nearly impossible to demote internal folk), so if they are let go soon after a promotion, it shows up as an embarrassing mistake on the part of a manager somewhere. On the politics side, this enforces a relatively conservative set of requirements for promotion: the candidate has to have visible achievements that are exemplary or above average for someone already working at the new level. So, internally, managers err on the side of caution and only promote after someone has been performing very well at the new level for a while, which is usually uncomfortably late for everyone involved, and which then puts pressure on for a larger jump in comp.

There’s also an angle related to hiring strategy. A company might be less risk-averse and hire 4-5 risky juniors for every 1-2 people who eventually make it to a senior position, so there’s pressure to recoup the cost of training and educating 4-5 people, of whom roughly half might be let go or otherwise leave before making senior. Especially at mediocre or badly managed companies, that creates pressure to keep folk who are starting to perform at senior levels at the lower tier, or at lower comp within the same bracket, for much longer than is appropriate.
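A back-of-envelope version of that cost-recoup pressure (every figure below is invented for illustration):

```python
# If the org hires several risky juniors per eventual senior, the training
# spend on the ones who wash out gets amortized over the survivors.
TRAINING_COST = 50_000      # assumed cost to ramp up one junior hire
JUNIORS_HIRED = 5           # juniors hired per cohort
SENIOR_RATE = 0.5           # fraction who make it to senior instead of leaving

survivors = JUNIORS_HIRED * SENIOR_RATE          # 2.5 on average
total_training = JUNIORS_HIRED * TRAINING_COST   # 250_000 sunk into the cohort

# Training spend the org tries to recoup from each surviving senior,
# i.e. the pressure to keep newly-senior folk at lower comp for a while:
recoup_per_survivor = total_training / survivors
print(recoup_per_survivor)  # 100000.0
```

Under these assumptions, each survivor implicitly carries twice their own training cost, which is one way to rationalize the underpayment window between "performing at senior level" and "paid at senior level."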

Looking at it from the point of view of hiring an outsider into a more senior position, it can look a lot less risky. The senior person will have 3-6 months to prove themselves, during which period they can be let go (or alternatively demoted; it's actually a lot more reasonable to demote a new hire under the right circumstances) with a lot less political cost or fanfare. The risk and cost of training and education were paid by someone else. The evaluation process for new hires is usually perceived to be much more selective and harder to game. To de-risk things, some of the higher compensation can take the form of a signing bonus that gets clawed back if the hire isn't a good fit. Similarly, on a political de-risking level, it's easier to fudge their experience and historic achievements to make it sound like they've accomplished enough of the politically necessary, high-visibility projects. So it's easier, especially with mediocre management in place, to justify paying a premium when hiring externally for more senior positions.

As a caveat: talented managers will usually find ways to make the comp and promotion curve much smoother, even when forced to work within the bounds of an unreasonable external framework.


This pattern can actually get really bad at the higher echelons of large engineering orgs. FAANG-like companies notoriously tend not to have enough of the required high-visibility projects or work for folk to get promoted to the upper levels of the engineering org at a reasonable rate (and then be compensated appropriately). It's often much easier and faster to get to the higher levels and force a compensation correction by leaving, achieving the politically required high-visibility work or projects elsewhere (e.g. at an early-stage startup), and then being acqui-hired or hired back one or two levels above the one the engineer quit at.


> This is the same question I ask about George Mason. Many people have remarked on how impressive it is that they have Tyler Cowen, Bryan Caplan, Robin Hanson, Garett Jones, etc, despite not being the sort of Ivy League school where you would expect famous people to congregate.

Most of these people -- if not all -- became influential after joining GMU. Robin Hanson was nearly a complete unknown when he took that job, as I recall. That is, he was unknown except in certain backwater internet communities where participants knew he was already a formidable intellect. I used to follow his science and tech conversations with Eliezer Yudkowsky back in the mid 90s and made a mental note to keep an eye on both of them.


R1 classification is not accurately described as "(e.g., Ivy+, Stanford, MIT, ...)". The category mostly comprises public state universities: https://en.wikipedia.org/wiki/List_of_research_universities_in_the_United_States
