Imagine that you're an undergraduate who excels at science and mathematics. You could go to medical school and become a doctor. Or you could become a teacher. Which would you choose?
If you're in the United States, you probably wouldn't see these as comparable choices. The average salary for a general practitioner doctor in 2010 was $161,000, and the average salary for a teacher was $45,226. Why would you choose to make less than a third as much in salary? Even if you care deeply about education and contributing to society, the opportunity cost for yourself and your family is enormous. Meanwhile in Finland, the general practitioner makes $68,000 and the teacher makes $37,455. Teachers in Finland are not paid as much as doctors, but Finnish teachers make more than half of what doctors do. In Finland, the opportunity cost of becoming a teacher is not nearly as great as in the U.S.
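The contrast is easier to see as ratios. A quick back-of-the-envelope calculation, using only the salary figures quoted above:

```python
# Teacher-to-doctor salary ratios, from the figures quoted above (2010 averages).
us_doctor, us_teacher = 161_000, 45_226
fi_doctor, fi_teacher = 68_000, 37_455

us_ratio = us_teacher / us_doctor   # under a third of a doctor's salary
fi_ratio = fi_teacher / fi_doctor   # more than half of a doctor's salary

print(f"U.S. teacher/doctor ratio:    {us_ratio:.2f}")  # 0.28
print(f"Finland teacher/doctor ratio: {fi_ratio:.2f}")  # 0.55
```

An American student choosing teaching gives up roughly 72 cents of every dollar a doctor would earn; a Finnish student gives up about 45 cents.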
The real problem of getting enough computer science teachers is the opportunity cost. We are struggling with this cost both at the K-12 level (primary and secondary school) and in higher education.
I have been exchanging email recently with Michael Marder of UTeach at the University of Texas at Austin. UTeach is an innovative and successful program that helps STEM undergraduates become teachers. They don't get many CS students who want to become CS teachers; CS is among the majors that provide the fewest future teachers. A 2011 report in the UK found that CS graduates are less likely to become teachers than other STEM graduates.
CS majors may be just as interested in teaching as other STEM majors, so why don't they become teachers? My guess is the perceived opportunity cost. The perception may be worse than the reality: the average starting salary for a certified teacher in Georgia is $38,925, and the average starting salary for a new software developer in the U.S. (setting aside the exorbitant offers at the top end) is $55,000. That's a big difference, but it's not the 3x difference between teachers and doctors.
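Putting that gap in the same terms as the doctor comparison, using the two starting salaries above:

```python
# The teacher-vs-developer starting-salary gap, in the same terms as
# the doctor comparison: how many times a teacher's salary does the
# alternative career pay?
ga_teacher = 38_925    # average starting salary, certified teacher in Georgia
us_developer = 55_000  # average starting salary, new U.S. software developer

gap = us_developer / ga_teacher
print(f"Developers start at about {gap:.2f}x a teacher's salary")  # ~1.41x
```

A 1.41x gap is real, but it is much closer to the Finnish doctor-teacher ratio than to the American one.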
We have a similar problem at the higher-education level. The National Academies just released a new report, Assessing and Responding to the Growth of Computer Science Undergraduate Enrollments (you can read it for free here, or buy a copy). The report describes the rapidly rising enrollments in CS (also described in the CRA Generation CS report) and the efforts to manage them. The problem is basically too many students for too few teachers, and one reason for too few teachers is that computing Ph.D.s are going into industry instead of academia. Quoting from the report (page 47):
CS faculty hiring has become a significant challenge nationwide. The number of new CIS Ph.D.s has increased by 21 percent from 2009 (1567 Ph.D.s) to 2015 (1903 Ph.D.s), as illustrated in Figure 3.15, while CIS bachelor's degree production has increased by 74 percent. During that time, the percentage of new Ph.D.s accepting jobs in industry has increased somewhat, from 45 to 57 percent according to the Taulbee survey. Today, academia does not necessarily look attractive to new Ph.D.s: the funding situation is tight and uncertain; the funding expectation of a department may be perceived as unreasonably high; the class sizes are large and not every new hire is prepared to teach large classes and manage TAs effectively; and the balance between building a research program and meeting teaching obligations becomes more challenging. For the majority of new CS Ph.D.s the research environment in industry is currently more attractive.
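The growth figure in the quote checks out against the raw Ph.D. counts it cites, and the arithmetic makes the mismatch plain:

```python
# Sanity-checking the growth figures quoted from the report.
phds_2009, phds_2015 = 1567, 1903
phd_growth = (phds_2015 - phds_2009) / phds_2009
bachelors_growth = 0.74  # stated in the report; raw counts not quoted here

print(f"Ph.D. growth 2009-2015: {phd_growth:.0%}")  # 21%, matching the report
print(f"Bachelor's growth outpaces Ph.D. growth by "
      f"{bachelors_growth / phd_growth:.1f}x")
```

Undergraduate degree production grew more than three times as fast as the supply of new Ph.D.s, and a rising share of even that supply went to industry.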
The opportunity cost here influences the individual graduate's choice. The report describes new CS Ph.D. graduates looking at industry vs. academia, seeing the challenges of academia, and opting for industry. This has been described as the "eating the seed corn" problem. (Eric Roberts has an origin story for the phrase at his website on the capacity crisis.)
That's a huge problem, but a similar and less well-documented problem is existing CS faculty taking leaves to go to industry. I don't know of any measures of this, but it certainly happens a lot. Perhaps the best-known example was when Uber "gutted" CMU's robotics lab (see the description here), but it happens far more often at the individual level. I know several robotics, AI, machine learning, and HCI researchers who have been hired away on extended leaves into industry. Those are CS faculty who are not on hand to help carry the teaching load for "Generation CS."
Faculty don't have to leave campus to work with industry. Argo AI, for example, makes a point of funding university-based research and keeping faculty on campus to teach the growing load of CS majors. Keeping the research on campus also helps to fund graduate students (who may be future CS Ph.D.s). There's likely an opportunity cost for Argo AI: by bringing the faculty to Argo full-time, they would likely get more research output. There's an associated opportunity cost for the faculty, too: going on leave into industry would likely mean greater pay.
On the other hand, industry that instead hires away existing faculty pays a different opportunity cost. When faculty go on leave, universities have fewer teachers to prepare the next generation of software engineers. The biggest cost falls on the non-CS majors. Here at Georgia Tech and elsewhere, it's the non-CS majors who are losing the most access to CS classes because there are too few teachers. We try hard to make sure that the CS majors get access to classes, but when the classes fill, it's the non-CS majors who lose out.
That's a real cost to industry. A recent report from Burning Glass documents the large number of jobs that require CS skills, but not a CS major. When we have too few CS teachers, those non-CS majors suffer the most.
In the long run, which is more productive: having CS faculty working full-time in industry today, or having a steady stream of well-prepared computer science graduates and non-CS majors with computer science skills for the future?