Last time, on Unfamiliar Letters…
Our hero wonders why the reality of academia is so different from its stated ideals. He reads about it and discovers that the higher learning in America combines three historical models, each with its own special intellectual mission and justification: moral formation (from the English college), economic benefits and technical training (from the American land grant), and disinterested theoretical research (from the German university). Also, there are sports.
For most of its history, American higher education was desperately poor and on the verge of collapse. How did it become the greatest system of higher education in the world? And then turn into the mess it is today?
What can be done to save alma mater from her many perils? Come and see…
III. The Boom Years (1914-1974)
Most college and university leaders publicly supported World War I but privately worried about its effects on enrollment. How could they survive if their prospective students were all in Europe? In a neat coup, they convinced the US government, headed by Woodrow Wilson (himself a Johns Hopkins PhD and a former Princeton president), that American institutes of higher learning could support the war effort by turning over their grounds for training exercises, and of course by providing physical and mental training for the officer corps.[1] Scientific research also proved useful in the war effort, as when Harvard’s James Conant took charge of and accelerated the nation’s production of mustard gas. (Go Crimson!)
If World War I was proof of concept, then World War II was the product launch. The Manhattan Project, in particular, made quite an impression.[2] The importance of high-level scientific and technical research was now so obvious that the relationship between the US government and higher education nearly flipped. In 1944, it was President Roosevelt asking Vannevar Bush—director of the top-secret Office of Scientific Research and Development, former MIT professor, and co-founder of Raytheon—how the OSRD’s application of “scientific knowledge to the solutions of the technical problems paramount in war” could also “be profitably employed in times of peace.”[3]
How nice of him to ask! Bush’s response was a report titled Science: The Endless Frontier. He promised that federal money for scientific research would bolster national defense, boost the economy, wage “the war against disease,” and train “that small body of men and women who understand the fundamental laws of nature and are skilled in the techniques of scientific research.”[4]
Lo, it came to pass. The National Science Foundation was created in 1950, and it joined the National Institutes of Health and the departments of Defense, Agriculture, Transportation, Energy, and Health in handing out research grants, especially for projects with clear defense implications.[5] Then came Sputnik. Federal funding followed that little metal orb right up through the mesosphere. “Between 1958 and 1964,” writes historian Paul Mattingly, “federal funding increased from $456 million to $1.2 billion, with basic research getting the lion’s share.”[6] As research budgets and facilities expanded dramatically, so did the schools themselves. Even the humanities received Cold War funding, as they were thought to promote good American liberal values and inoculate young minds against communist collectivism.[7]
Alongside the research funding came an enrollment explosion. The GI Bill helped 2.2 million veterans go to college, bringing along $3.9 billion in tuition.[8] Although GI benefits were not distributed equally along racial and ethnic lines, they did bring hundreds of thousands of black, Catholic, and Jewish students into an old WASP institution. The GI Bill made American higher education more inclusive and established it as a social elevator for ambitious youngsters of every kind, a reliable step toward high-paying, high-status careers.
Recognizing the astonishing success of the GI programs, the 1947 Truman Commission Report urged the federal government to expand its funding role even further, and to ensure that neither race nor wealth should bar talented, motivated students from attaining the highest degrees they could.[9] The commission estimated that 49% of the population would benefit from at least junior college, and 32% could earn a BA or a more advanced degree. Between 1955 and 1974, the college population more than trebled, from 2.5 to 8.8 million students, rising from 17.8 to 33.5% of the college-aged population.[10] The commission explained that while education was good for individual students and the economy, of course, it was also good for America: “Only [by expanding higher education] can we make sure that all who participate in democracy are adequately prepared to do so.”[11]
By 1970 American higher education had emerged in its mature form. At the top were about 100 research universities, followed by smaller state and regional universities, and then an even larger number of junior or community colleges below. Hundreds of small liberal arts colleges remained vibrant as well. The top universities (and elite liberal arts colleges) conducted most of the research and trained the elite white-collar workforce, while the regional state schools and community colleges were the main engines of social mobility. American higher education had gone from a plucky upstart to the best system in the world.
Is it any wonder that professors remember these as the good old days? For one thing, there were plenty of jobs. From 1955 to 1974, the faculty grew right along with the student body, from 266,000 to 633,000 members. But it wasn’t only that. Every part of the institutional mission seemed to be both working on its own terms and complementing all the others. Universities conducted theoretical research that led to practical advancements in computers and artificial intelligence.[12] They educated wave after wave of young Americans in the liberal arts and sciences, which gave these graduates the conceptual skills necessary to enter the nation’s professional and managerial classes, ready to organize an industrial society and then move beyond it into the information age. Higher education wasn’t just turning out a few leaders anymore. It was producing a leadership class.
Moreover, during the post-war boom, the success of US higher education was bound up with the success of the nation as a whole. Although the causes of American post-war prosperity are legion (it helped that every other major industrial economy lay in ruins), a major factor has to be the expansion and success of higher education. This country placed an enormous bet on the higher learning, and the bet paid off. At last, higher education had a role akin to the one John Eliot imagined for it in 1633: central to the good of the commonwealth.
In truth, though, the worm was already in the rose. As UC Berkeley president Clark Kerr said in his famous 1963 Godkin lectures, “The university is so many things to so many different people that it must, of necessity, be partially at war with itself.”[13] In the years to come, anyone looking closely could see cracks forming in the ivy-covered foundations. For example, if the schools were repositories of good liberal values, why were so many students at Kerr’s Berkeley and elsewhere in violent uproar? Couldn’t federal and corporate sponsorship influence the direction, or worse, the results of supposedly disinterested research? What was the exact relationship between education and a career, and did it really require so many classes in history and literature? And wasn’t there some tension between the egalitarian promise of mass education and the simultaneous production of what Kerr called “the elite of merit”?[14]
IV. A Troubled Giant (1974-2002)
Never mind all that, though. The institution was still expanding. The number of students swelled from 10.23 million in 1974 to 12.50 million in 1986. Fully 75% of the buildings on American campuses were constructed between 1960 and 1986, leading historian John Thelin to quip that “the symbol for higher education during the cold war ought to be the building crane.”
While multi-year building campaigns were locking enormous costs into place, however, the funding model started to change. Partly due to a backlash against the student revolts, partly due to tax revolts like the one in California, and partly due to the early-1970s recession, state support began to fall. Between 1975 and 1996 public institutions’ state and local support dropped from 61% to 51% of the budget, and federal funding fell from a high point of 23% in 1965 to 14% in 1996. “The overall pattern here is clear,” wrote Michael S. McPherson and Morton Owen Schapiro: “Tuition has been replacing government spending at both public and private institutions.”[15]
Federal tuition support changed, too. Created by the 1972 amendments to the Higher Education Act of 1965, the Pell Grant program carried on the spirit of the GI Bill by providing direct tuition assistance, although it was now need-based rather than a (purportedly) universal benefit for soldiers. By 1978, however, the majority of federal tuition aid was no longer grant-based. It took the form of loans.[16]
This followed the general trend of American life in the waning days of the Cold War. Public goods became private, and costs shifted from society as a whole to individuals and families. With public support falling and building and research costs rising, tuition could only go one way: up. Between 1974 and 1986 the average yearly cost at public institutions more than doubled, and at private institutions it nearly tripled.[17] Tuition had always been a part of American higher education, of course, but from the 1980s forward, student debt would become a more and more central feature of higher education.
By promising to train the nation’s social leadership class, higher education also inevitably found itself embroiled in American race relations. Shouldn’t the nation’s leadership reflect its overall ethnic composition? In order to meet their mission of mass education, and in accordance with the institutions’ self-images as bastions of egalitarian, liberal democracy, some in higher education tried to diversify the student body and faculty. The result was a power struggle over the purpose of the university, fought by three groups:
Those who saw minority admission as part of a larger effort to redefine the concepts of merit and the priorities of universities, those who saw the social and economic mobility of minorities as an important national goal that universities could serve in certain ways, and those who saw academic standards of merit as inviolable.[18]
The example of Joseph Watson is instructive. A black chemist trained at UCLA in the late 1960s, Watson was the provost of “Third College,” a self-consciously multicultural college within the larger University of California, San Diego. He recruited minority students and, in the good collegiate style, set up special mentorship programs to help them excel in fields like math and the hard sciences, even when their high schools had left them ill-prepared to do so. His middle way made almost nobody happy. Many of his colleagues outside of Third College saw his initiatives as a betrayal of the university’s traditional intellectual standards, while his more radical students saw him as a sellout. They wanted to discard both the old liberal arts curriculum and the German scientific method, and to reimagine the college as a training ground for radical activism.
These same three groups would duke it out roughly every 25 years. In the mid-1990s they fought over affirmative action and political correctness. From the mid-2010s to today, they have been fighting over social justice and anti-racism. The underlying conflict is identical.
No matter who was right, the constant infighting further alienated the public and political figures, who were starting to ask pointed questions about skyrocketing tuition. Through the 1980s and 1990s, leaders in higher education blamed falling state and federal support, of course, but what about the consequences of their own endless appetite for growth? Their building campaigns had committed them to higher expenses for decades to come; scientific discovery required the latest labs and technology, which were unbelievably expensive; they expanded to accommodate more and more students, because the economy needed more educated workers, the country needed more educated citizens (and they needed the tuition dollars); vocational programs were vital, but even they seldom paid for themselves; and then there was the athletics department, which would surely make money once the new stadium was built! Colleges and universities would not stop growing. Or maybe they couldn’t.
Besides, the students kept paying. What choice did they have? By the end of the 1990s higher education was the only reliable path into the professional and managerial classes, at a time when outsourcing was devastating the old industrial working class. As gatekeepers, colleges and universities could charge a heavy toll. Oddly enough, students and their families wanted to pay. In what became known as “the Mount Holyoke phenomenon,” a college that raised its tuition could count on more qualified applicants, and more of them.[19] In the popular imagination, price meant prestige, and a prestigious degree meant an open door to success in American life. Falling state support, rising costs within colleges and universities, and student demand all conspired to send tuition through the roof.
By the turn of the millennium, higher education was bigger than ever, pursuing each of its missions—disinterested research, direct economic benefits, and the formation of the nation’s social leadership class—on ever-grander scales. But just as gigantism puts enormous pressure on the heart and bone structure, higher education’s endless growth had made it “a troubled giant.”[20]
V. My Bright College Years, Plus 8 Years of Grad School (2002-2020)
Looking back at my undergraduate and grad school days (roughly the 2000s and 2010s), I can now see that they were more of the same: expansion in every direction, but without the Cold War funding model that had made the boom years viable. By some measures, things are better than ever. Overall enrollment rose from 16.61 million in 2002 to an estimated 19.78 million in 2021. As for scientific research, since 1980 the number of papers recorded by the Web of Science has at least quadrupled.[21] As for its economic impact, higher education now receives 2.6% of the national GDP, and in return it “has contributed uncountable improvements in economic development and human understanding.”[22] All true, and yet behind the numbers, every one of higher education’s traditional missions seems to be going awry.
Take its traditional mission to educate the workforce and thereby contribute to the common wealth. Today over half of the college-aged population is in school, just as the Truman Commission hoped. Especially after the financial crisis, more students than ever stampeded onto campus (a total of 21.02 million in 2011), all hoping to improve their economic prospects during the brutal post-crash recession. But as students were enrolling in record numbers, state support plummeted along with the stock market, pushing tuition even higher. As a result, today’s total student debt stands at $1.57 trillion. An educated workforce is good for the economy, but the situation is increasingly terrible for students themselves, who spend years if not decades paying off debt rather than building equity. Moreover, the rising number of graduates has eroded the college wage premium. In strict economic terms, students are paying more for less. Their careerism, the clamoring for higher grades and a more career-focused curriculum, is an annoying but perfectly rational response to this state of affairs.
Plus, student careerism has its parallel in faculty careerism: the drive to publish as much as humanly possible. There are indeed more publications than ever. But does more publication mean more knowledge? In order to boost their numbers, scientists frequently split their research into LPUs, or “least publishable units,” which cannot be good for the dissemination of knowledge. New science centers and medical schools and whole research parks keep popping up around the nation’s campuses (just like new football stadiums), but these facilities’ incredible costs all but guarantee research with an eye to funding—whatever’s hot in the federal or private sector.[23] Is this really disinterested scientific inquiry?
Humanistic research is faring even worse. In the face of a job market with a casualty rate like the Battle of the Somme, the very survival of humanistic scholarship is in doubt. Graduate programs keep admitting more and more doctoral students, with the ostensible goal of training more faculty to do more research, but the tight job market and adjunctification have pushed an untold number of these aspiring scholars out of higher education. We will never know what they might have discovered if they had remained. The loss to scholarship and human knowledge is literally incalculable.
In terms of moral instruction, efforts to connect the liberal arts to American citizenship have been more noble than successful. For decades, moral formation had been slipping out of the curriculum, happening instead in extracurricular activities, on sports teams, etc. But in the last five years, morality has come roaring back. Its basic thrust can be summed up in a single word. “I examined eighty strategic plans in U.S. research universities in the early 2000s,” writes Steven Brint, and “the only theme each one shared in common was a stated and seemingly heartfelt commitment to diversity” [emphasis added].[24] The basic moral message of most colleges and universities today, especially as communicated in their official administrative statements and humanities classes, seems to be that an institution is morally sound if it reflects the ethnic and gender makeup of the nation at large, and ethically suspect if it does not.
Following the bacchanalian atmosphere on campuses in the 2000s (trust me, I was there), the recent moral fervor surrounding sex and racial issues is, on my reading, an effort to reassert some kind of old-fashioned collegiate moral control over campus life. For better and for worse. Notice how often, especially on issues of race, campus controversies quickly move from individual transgressions to larger issues of belonging, the community’s values, and feeling at home—all familiar concerns of the old residential college. The students themselves often call for closer moral surveillance and harsher punishments, in order to encourage virtue. The old college ideal is back, with diversity in place of piety.
Some of these reckonings were long overdue, but when enthusiastic movements demand veto power over disinterested research, technical education, and basically everything else that goes on around campus, I part ways with them. Other intellectual traditions and missions have their own integrity, and an equal right to a place on campus on their own terms.
In sum, it seems to me that each of higher education’s traditional missions is either at war with the others, undermining itself, or placing enormous strain on students, faculty, and administrators alike. The thing to see is that every step along the way has been logical and justifiable. Indeed, who could object to any of it? Who could be against more contributions to the local economy? Against more scholarly research? Against more diversity initiatives? You would be run out of a college town on a rail.
VI. W(h)ither Higher Education?
I began this letter, Ruth (so long ago), with a quote from UC Berkeley’s late president Clark Kerr. He said that although institutes of higher learning liked to think of themselves as cloisters, cut off from the everyday concerns of the world, the reality is that they have “always responded, but seldom so quickly as today, to the desires and demands of external groups—sometimes for love, sometimes for gain, increasingly willingly, and, in some cases, too eagerly.” Seldom so quickly as today? That was 1963. President Kerr, you ain’t seen nothing yet. Today colleges and universities are trying to train the American workforce, contribute directly to the local economy, make scientific discoveries, preserve and disseminate humanistic learning, and, as if that were not enough, overcome long-standing racial inequalities by diversifying the nation’s professional classes—all on a scale that would have been inconceivable in 1633, or even in 1933. It’s hard to imagine an institution, of any sort, that is trying to meet more social needs.
Should it do less? There are ways in which higher education should probably cut back. It should, for example, cut its PhD cohorts in half, especially in the humanities. A miserable prospect, but at this point necessary, given the state of the job market.
In general, though, cutting back would be a disaster. What would happen if an institution tried to pump the brakes on any of its traditional missions? Between the nation’s changing demographics and the importance of enrollment for tuition, diversity is not only a moral mission, but also an existential issue. Without reaching new populations, colleges will face lean years. Many would probably close. Without disinterested scientific research, institutions would lose their most distinctive feature, and much federal funding. And although professors are fond of denouncing capitalism, just imagine what would happen if colleges and universities consciously uncoupled themselves from the American economy, to the point where a college education was no longer a prerequisite for a white-collar career. No need to imagine, actually. That’s how it was between 1633 and 1945, and schools and professors were desperately poor. Every time a senior faculty member complains about student careerism or the corporatization of campus, I can only agree—but I also wonder how much of their salary they would really give up to be free of those things. I doubt the answer’s much.
This is why I compare higher education to the sorcerer’s apprentice. It has summoned up forces, taken on missions, that have exceeded its power to control. They are now running with a momentum of their own, pulling the institutions along behind them, much to everyone’s detriment.
Can anything be done?
The usual answer is more public funding. Under current conditions, this is unlikely. The funding levels in the 1950s and 60s are a relic of the Cold War. I suppose there could always be a new Cold War with China, which would have the advantage of goosing budgets for scientific research but the disadvantage of nuclear brinksmanship. Not worth it, surely.
And yet, my historical survey suggests that American higher education was never stronger than when it was an arm of the military industrial complex. This has been an uncomfortable realization for me. Personally, I second William James, who in a 1906 speech at Stanford said, “I will now confess my own utopia. I devoutly believe in the reign of peace and in the gradual advent of some kind of socialistic equilibrium.” The trouble, James admitted, was that war was the best occasion for courage, heroism, and astonishing feats of both individual will and collective organization; it was “the great preserver of our ideals of hardihood, and human life with no use for hardihood would be contemptible.” It seems to me that these qualities, though seldom acknowledged, have also been essential for the historical success of American higher education. Its overall growth is the product of an almost mad ambition, just as every dissertation is a titanic, quixotic act of will. What would the world and/or campus be like without these martial qualities? “Fie upon such a cattleyard of a planet!” cries James. Fie upon such a cattleyard of a campus! cries Paul.
But how to preserve the swashbuckling side of life without, you know, “the bath of blood”? James’ answer was “the moral equivalent of war,” some grand undertaking that would call on these heroic virtues without all the killing. In a proposal that laid the intellectual groundwork for the Civilian Conservation Corps, he imagined an “army enlisted against Nature”—that is, against famine, disease, hunger, and privation. (You can hear an echo of James’ rhetoric in Vannevar Bush’s promise that scientists could fight “the war against disease.”)
The war against nature has gone a little too well recently, but it seems to me that a moral war for nature, or more specifically against climate change, would work as well. Higher education could have an important role in that ambitious, worldwide campaign. It would be big enough, and touch on enough aspects of human life, to call on all the resources that our colleges and universities have to offer: disinterested science to model climate patterns and dream up new carbon sequestration technologies; technical expertise to improve alternative energy sources and install them across the globe; the humanities to think through the moral dimensions and cultural consequences of such an undertaking (who gets what? who pays for it? what do we owe to those whose habitats or livelihoods we cannot save? etc.). Not everything on campus would have to be related to the war for nature, of course, any more than everything in the 1950s and 60s was bent on beating the Russkies. But it would get the institutional missions pulling in the same direction again.
And justify some more funding.
Yours,
Paul
[1] John Thelin. A History of American Higher Education, 2nd ed. Baltimore, MD: Johns Hopkins University Press, 2011. Pages 200-201.
[2] Mattingly. Page 273. Cf. 297.
[3] Franklin Delano Roosevelt. In: Essential Documents in the History of American Higher Education. Ed. John R. Thelin. Baltimore, MD: Johns Hopkins University Press, 2014. Page 217.
[4] Vannevar Bush. Science: The Endless Frontier. Ibid. Page 222.
[5] Thelin. A History of American Higher Education. Page 272.
[6] Mattingly. Page 290.
[7] Kathleen D. McCarthy. “From Cold War to Cultural Development: The International Cultural Activities of the Ford Foundation, 1950-1980.” Daedalus, vol. 116, no. 1, 1987, pp. 93-117.
[8] Ibid. Pages 280-283.
[9] Thelin. A History of American Higher Education. Pages 268-271.
[10] Mattingly. Page 295.
[11] George F. Zook. In: Essential Documents in the History of American Higher Education. Ed. John R. Thelin.
[12] Mattingly. Page 291.
[13] Clark Kerr. “The Idea of a Multiversity.” The Uses of the University. Cambridge, MA: Harvard University Press, 2001. Page 7.
[14] Ibid. Page 11.
[15] Michael S. McPherson and Morton Owen Schapiro. “The End of the Student Aid Era? Higher Education Finance in the United States.” In: A Faithful Mirror: Reflections on the College Board and Education in America. Ed. Michael C. Johanek. New York, NY: The College Board, 2001. Page 337. Statistics found in Table 1, page 367.
[16] Thelin. Pages 324-326.
[17] Average Undergraduate Tuition and Fees and Room and Board Rates Charged for Full-Time Students in Degree-Granting Institutions, by Type and Control of Institution: 1964-65 through 2006-2007 [in current US dollars]. National Center for Education Statistics. Accessed June 19, 2021. https://nces.ed.gov/programs/digest/d07/tables/dt07_320.asp
[18] Julie Reuben. “Merit, Mission, and Minority Students: The History of Debate Over Special Admissions Programs.” In: A Faithful Mirror: Reflections on the College Board and Education in America. Ed. Michael C. Johanek. New York, NY: The College Board, 2001. Page 221.
[19] Thelin. Page 351.
[20] Ibid. Page 317.
[21] Steven Brint. Two Cheers for Higher Education: Why American Universities Are Stronger Than Ever—and How to Meet the Challenges They Face. Princeton, NJ: Princeton University Press, 2018. Page 9.
[22] Ibid.
[23] Thelin. Pages 382-384. Cf. 424-426.
[24] Brint. Page 134.