Dear Sondra,
A good measure of a profession’s health is the extent to which its practitioners can live out its myths. I mean myth in a non-pejorative sense: tales that present archetypal figures, narrative patterns, and ideas and values that give adherents a sense of themselves, their place in the world, and their overall purpose in it.
Many professions have myths. They are reenacted during collective rituals or represented by totems, which if small enough can be kept on hand for daily reminders. For example, most doctors still swear something like the Hippocratic oath during medical school graduations. It lays out their duties and connects them to a long line of physicians who have discharged these duties before, hopefully inspiring the new doctors to do the same in the future. Likewise, a journalist of my acquaintance keeps a bottle of whisky hidden in her desk. Not that she would ever drink on the job, but the liquid courage serves as a reminder of her forebears, who certainly did and were also brave enough to stand up to numerous, better-funded foes.
As these examples suggest, professional myths are more than abstractions. They must come to life in professional practice. If they do not, then they will become myths in the more pejorative sense: old stories that nobody believes anymore.
One of higher education’s most venerable myths is the absent-minded professor. Heads in the clouds, lost in thought, their brilliant grasp of the most abstruse and difficult ideas equaled only by their inability to observe social niceties or tend to the basic necessities of life. My sense is that this figure is an endangered species today. It is disappearing from an environment in which social and institutional savvy are all but necessary for survival. I should hate to see them, and their myth, go the way of the passenger pigeon. They have brought great joy to me, among others, and oriented scientists and philosophers to their true north for thousands of years.
What Was the Nutty Professor?
Perhaps the original absent-minded intellectual was Thales of Miletus who, while he was walking along one day, was so engrossed in his contemplation of the heavens that he fell into a well. In the Theaetetus, one of Plato’s dialogues, Socrates admits that “the same jest applies to all who pass their lives in philosophy.” Others forget where they are, or even when. Thomas Aquinas was once at a great feast with Louis IX of France, but while all the other guests were making merry, he said nothing to anyone for hours, until without any warning whatsoever the great doctor (great in intellect and girth) slammed his hand on the table and bellowed “THUS I REFUTE THE MANICHEES!” You will remember, Sondra, that the Manichees had been more or less extinct for a thousand years.
The unworldly scholar has been a stock character in fiction, too. In The Castaways, Jules Verne introduces Jacques Paganel, a geographer who once accidentally learned Portuguese rather than Spanish and joins the adventurers when he mistakenly boards their boat bound for South America, instead of another boat bound for India. In a trope common to the absent-minded professor type, he looks ridiculous: “He wore a traveling cap, stout yellow buskins and leather gaiters, pantaloons of maroon velvet, and a jacket of the same material, whose innumerable pockets seemed stuffed with note-books, memoranda, scraps, portfolios, and a thousand articles as inconvenient as they were useless.” He deciphers puzzles and ancient languages, many times saving the crew from danger, but by the end of the novel he has also been kidnapped by the Maori and tattooed from head to foot.
Whether these descriptions are historical, fictional, or apocryphal matters little when it comes to their mythic significance. Reading of these intellectual and personal foibles, those with scholarly temperaments may well recognize a bit of themselves, smile fondly, and turn back to their “thousand articles as inconvenient as they [are] useless.”
Naturally, unworldly intellects have offered irresistible targets for satirists in every age. In Gulliver’s Travels, the titular character meets with a race, obviously inspired by Thales, whose “heads were all reclined either to the right or to the left; one of their eyes turned inward, and the other directly up to the zenith.” Unable to communicate with each other unaided, they depend on servants who slap them on the ears whenever someone wants to speak to them, or slap them on the mouth whenever someone wants to hear what they have to say. In another chapter, Gulliver visits a parody of the Royal Society, where one scientist “had been eight years upon a project for extracting sun-beams out of cucumbers,” while another scientist attempts to “reduce human excrement to its original food.”[1]
Although Swift’s scientists live in the fictional land of Lagado, they are recognizably English. Indeed, the quintessential absent-minded intellectual might well be a Victorian Englishman, an aristocrat or country vicar with next to no responsibilities and plenty of time for botanical experiments in his garden (getting those sunbeams back from the cucumbers, etc.). But we also meet a surprising number of absent-minded professors in 20th-century American popular culture, especially at the movies. True to type, most are brilliant but socially oblivious, and lovably so. In Bringing Up Baby (1938), Cary Grant plays a paleontologist who is looking for a lost brontosaurus intercostal clavicle but—despite being chased around a Connecticut country house by none other than an amorous Katharine Hepburn—he cannot find his bone! The 1960s saw a parade of comedies with titles like The Nutty Professor and The Absent-Minded Professor, both of which were remade in the 1990s (the latter as Flubber). My point is that nutty professors belong to the American imagination. They are as familiar to us today as commedia dell’arte characters were to 17th-century Italians.[2]
And why wouldn’t they be? These absent-minded professors did not exist only on film. They were popular because they reminded the American public, which was increasingly going to college all through the 20th century, of characters from their own lives. To this day, I suspect that anyone who has attended an American college or university, let alone worked at one, has a handful of stories about the eccentric habits and manners of their favorite faculty members—available upon request.
Given that almost anyone can recall an absent-minded professor from their own bright college years, it may seem unlikely that they are, in fact, an endangered species. Yet I think they are. I contend that the American faculty is becoming less eccentric and more uniform by the year. Their uniforms, in fact, are a good place to start.
During graduate school, when my friends and I were on our way to the American Academy of Religion (AAR) annual conference, we used to play a little game: AAR or not? One of us would point to a stranger, and then the others would have to decide, based on the stranger’s appearance, if he or she were going to the AAR, or not. Our fellow scholars were laughably easy to spot. In five years of playing this game, I cannot remember a single time we decided someone was not going to the AAR, and then he or she turned out to be there.
I shall describe the men, as those who make light of the female academics’ impossible sartorial balancing act (always feminine, but somehow never sexy or cute) fall into deep shit. Besides, in my experience, the male graduate students were always the easiest to spot.
Few male graduate students look like Cary Grant, Sondra, as you have doubtless observed. They have neither his flawless features nor his flawlessly tailored suits. Yet neither are there many Jacques Paganels. Aspiring male professors are almost invariably well-kempt, physically fit (relative to the American population as a whole), and handsome in an unobtrusive sort of way. Their clothes fit well enough, but are obviously off the rack, and they favor subdued earth tones for khaki pants and oxford shirts with light-colored checks. The Brits would call it smart casual. It’s a completely unobjectionable look.
There are variations, of course, but almost all on this theme. For instance, tenured men, being a little older and better-paid and filling out around the middle, wrap the standard ensemble in a tweed jacket, a Christmas present from their spouse. If the conference takes place in a colder climate, the male scholar will sport a pull-over, zip-up sweater. (I myself used to have three.)
To be sure, some young men make what appear to be daring fashion choices, like bright red sneakers or glasses, but these are so obviously calculated to communicate a daring disregard for convention that they suggest the exact opposite: an obsession with how others perceive them.
Perhaps I am putting too much weight on outward appearance. Of course, you might say, everyone is in an unofficial uniform at the AAR—it’s a professional conference! True enough. Yet I contend that these fashions are the superficial signs of deeper changes in the profession.
What Happened to the Absent-Minded Professors? (Did They Just Wander Off?)
I am not the first to notice a certain sameness in the professoriate. After leaving higher education to join the War Department, Thorstein Veblen launched a “gas attack” (in the words of one reviewer) on his former profession. It was The Higher Learning in America: A Memorandum on the Conduct of Universities by Business Men, which Veblen started in 1904 but only published in 1918, after he had put some distance between himself and his former colleagues.
It is mostly a scalding critique of university presidents, boards, and administrators, but he also has plenty to say about his fellow professors’ failures to live up to their vocations. Veblen defines the fundamental characteristics of the scholar as “idle curiosity” and “an addiction to the pursuit of knowledge.” It is easy to see how these personality traits would encourage odd people and esoteric knowledge. But they also run contrary to the needs of business enterprise, which demands immediate, practical results. Alas, the power to hire, promote, and fire professors lay with the administration, which was to say with business-minded presidents, deans, and their donors.
Then, as now, academic appointments were few and far between. Even more than today, they were filled via a system of personal patronage. In order to have a career, it behooved the aspiring professor to adopt the administrator’s (read: businessman’s) manner of life and thought. The requisite social niceties weeded out anyone with an unconventional “domestic life, … racial, religious or political status,” Veblen wrote. Instead, academia tended to reward those whose “horizon [was] bounded by the same limits of commonplace insight and preoccupations as are the prevailing opinions of the conservative middle class. That is to say, a large and aggressive mediocrity is the prime qualification for a leader of science in these lines.”
Ouch. A gas attack indeed! But as Veblen was publishing his book, important changes were already afoot in higher education. First, bristling at the control of presidents (and donors), professors started to agitate for and win strong protections for freedom of inquiry. That is, the tenure system. An unconventional “domestic life, … racial, religious or political status” was no longer an adequate reason for termination, and those who had one could more easily survive. Second, the widespread adoption of a new model for research, imported from the German university, encouraged disinterested inquiry (i.e., idle curiosity) and positively welcomed views that ran against the prevailing opinions of the majority. The German model also required extreme specialization, which was perfect for those whose addiction to study led them into the rarefied heights of scholarship. In short, weirdness became an advantage, not a liability.
Finally, institutions of higher learning proliferated and expanded. That meant many, many more professors. Between 1955 and 1974 alone, faculty ranks swelled from 266,000 to 633,000. With so many positions available, professors no longer needed to adopt the dress and attitudes of the business class. They could be unworldly, laughably impractical in research and in life, and still get hired. When the great historian of American religion David Hollinger was finishing his dissertation in the late 1960s, he informed his advisor that he would only accept jobs in the western half of the country, ideally on the coast itself. He spent the next 50 years at UC Berkeley. I think it is not a coincidence that the nutty professor became a stock American figure at this exact moment, when the institution was growing to accommodate them, and millions of students encountered them.
Insouciance like Hollinger’s is unimaginable today. Despite my trolling subtitle, Sondra, I do not think that professors are boring. (Why, some of my best friends…!) What I do think is that the professional conditions in higher education today militate against the survival of certain personality traits that were until recently central parts of the professoriate’s self-conception. Today, the process of becoming a professor seems designed to weed out the most eccentric or, for those who aren’t complete weirdos, to sand off the rough and spiky edges of their personalities. Thus, I am not only concerned about the fate of individual nutty professors. I am concerned about what happens to the eccentricity in all of us under conditions of extreme competitive pressure.
About that competitive pressure: it is next to impossible for anyone to get a job, let alone the absent-minded. To review the grim numbers, in 2007-8 the MLA listed 3,506 job openings. In 2019-2020, that number was 1,411. Since the 2008 crash, the number of PhDs awarded in history has outpaced the number of jobs available by nearly 2 to 1. Each decent position receives hundreds of applications. You had better odds at the Battle of the Somme. I doubt things are any better in my old field of religious studies, or yours of music history.
As the number of positions dwindles, so does the number of viable topics. There have always been scholarly fads, of course, and not all fads are vapid, as topics often become popular when they’ve been unjustly neglected for too long. But with a declining number of jobs overall, the hottest topics in the field assume outsized importance. Today there is room for precious little else. This favors the savvy, who can smell a hot topic a mile off, while the absent-minded may not even realize that their particular interest has no future in the field until it is too late.
But why wait for the job market? Homogenization can begin far earlier in a grad student’s life. Anyone with any sense at all will realize how competitive the job market is and plan accordingly. They will specialize in a popular topic early and adopt the methods and scholarly standards of said subfield, identifying the existing intellectual norms and following them slavishly, until they are internalized as second nature. On the one hand, this process of professionalization is perfectly in line with the old German research ethos of specialization. The students are learning the standards of scholarship, and how to meet them! But whereas in Veblen’s day specialization opened up new vistas of intellectual possibility, today it just as easily closes them down, as students must strive ever harder to fill some narrowing niche, which is vigilantly patrolled by the field’s grand pooh-bahs—or by peers with popular social media accounts.
It might be objected that scholarly publication requires originality, so it in fact favors those whose obliviousness protects them from groupthink. I would say scholarly publication requires originality up to a point, but no further. Most of today’s young scholars, finishing their dissertations, are far more accomplished researchers and writers than their counterparts 50 years ago. Those who get jobs now generally have two or three peer-reviewed articles to their names, whereas that would once have been a tenure-worthy CV. But they achieve this measurable success precisely by learning, following, and internalizing existing professional standards. Their articles must say something new, but nothing too out of step with current thinking. Of course, every graduate student comes to campus with dreams of shaking the very foundations of his or her field, but the smart money is on those who quickly figure out how to “fill a hole in the existing literature.”
A few absent-minded types will stumble into campus interviews, of course, but the subsequent selection process does not favor them. On the contrary, as anyone who has passed through it will tell you, the campus visit requires the social savvy of a courtier, or even a courtesan. Every word of the job talk must be unimpeachable. Every potential interview question must be prepped for, and the answer calibrated perfectly to tell each member of the committee what he or she wants to hear. The candidate must maintain perfect comportment all through the campus visit, even while the senior faculty on the hiring committee often behave worse than donkeys. Is this a process that favors the absent-minded? The weird and sometimes quarrelsome? Or is it an atmosphere that rewards the kind of intellectual kowtowing that Veblen skewered so deftly?
For those who win a place on the tenure track, the pressure only increases. So close to the prize, a faux pas must be avoided at any cost. After eight years of graduate school, assistant professors spend another seven intent on pleasing journal referees and on not ticking off their senior colleagues, lest those seniors kneecap them with a bad departmental letter to the tenure review committee right at the end of the fifteen-year marathon.
But once they earn tenure, can’t these professors then spring forth with all of their weird and radical ideas? Yes, isn’t it pretty to think so. Fifteen years of exquisite sensitivity to how others see you cannot be shaken off so easily. The problem is that behavior, even feigned behavior, becomes habit. Soon enough habit becomes character. And character becomes fate.
What is the result of this whole process? Some oddballs do slip through, of course, but on the whole, academia greatly favors the survival of those with a gift for what Gulliver, observing the court at Lilliput, described as “leaping and creeping.” That is, brilliant skill at social games.
Another way to see the sameness is by paraphrasing Veblen to the effect that the faculty’s “horizon is bounded by the same limits of commonplace insight and preoccupations as are the prevailing opinions of the [liberal] middle class.” It is a notorious fact that college professors are more liberal than the general population, and conservatives are forever calling them dangerous radicals, which suits most of the faculty’s self-conception quite nicely. The truth is less frightening but more embarrassing. A healthy majority of professors hold views that are pretty much indistinguishable from the rest of what Barbara and John Ehrenreich dubbed “the professional-managerial class,” which is now almost entirely synonymous with the Democratic Party and cosmopolitan finance capital. I still do not understand how all these groups came under one banner (it has something to do with the migration of ideas from academic departments to HR departments, and then back again into academic administration), but sure enough, polite political opinion in the faculty lounge generally runs the gamut all the way from Elizabeth Warren to Kamala Harris. If I had to choose, I suppose I would prefer our new Democratic HR overlords to the local Republican burghers, but what looks to some like the fall of western civilization is probably closer to a consolidation of existing corporate capital, albeit now more demographically diverse, and with equity as its keystone moral justification.
In other words, as in Veblen’s day, academia is conducted according to a certain set of business values, and it once again has little room for those who are unwilling, unable, or simply unaware that it’s time to fit in with the rest of the team.
Who Cares?
I do, obviously. I think eccentrics are good for scholarship and the expansion of human knowledge. A philosopher friend once told me that the most boring papers were the ones that were perfectly correct. Everyone just skimmed through them, nodding along, and then promptly forgot about them and read something else. The most exciting papers were interestingly wrong. In order for something to be interestingly wrong, it must be in some way askew from the normal ways of doing and seeing things, which is exactly the kind of thing one gets from people so lost in the intellectual heavens that they can’t see the social ground in front of them. These papers send everyone else scurrying after refutations and solutions—in short, heading in new directions. Sometimes, of course, it turns out they’re not wrong at all.
Also, I care because the loss of the myth has left its former devotees disoriented and unhappy. If I could use one word to sum up academia today, it would be demoralized. It is as if an earthquake had destroyed the temple, and the priests were wandering about dazed in its ruins. There are a hundred reasons for this demoralization, of course, but I contend that an important and little-noted factor is the loss of the scholar’s myth. I know perfectly well that American higher education was never really a cloister, never truly cut off from the concerns and needs of the outside, workaday world. But at least there used to be some separation. It has become almost impossible to believe that there’s much of a difference between working in higher education and any other professional sector. Stable academic employment these days entails an unexpected number of administrative duties, further blurring the line between the faculty and other sorts of professionals. There is less time than one might imagine to indulge “idle curiosity” or satisfy an “addiction to the pursuit of knowledge.” Professors are also surprisingly concerned with their own professional advancement, at least as intent on it as their fellow white-collar professionals. (I still remember nearly falling off my chair the first time I heard an associate professor refer to himself as “mid-career.”) Of course, as I have suggested at length, they have to be inordinately concerned with their careers to survive, but when professors no longer live out the old myth of disinterested contemplation, the difference between them and lawyers and accountants all but collapses. Identity crises are to be expected, and they hurt.
I should pause to note that I have not been describing myself here. I do not count myself among the absent-minded. On the contrary, as my success at the game of AAR or not? suggests, I have a good eye for social cues. The truth is that I am better at observation than abstract reasoning, more a Gulliver than a Paganel.
Yet I have always admired the Paganels. As for why, the answer is right there in the Theaetetus. Socrates and his interlocutor try every which way to prove that true knowledge can be had by direct observation; that is, through concern only with the world as it appears in front of us. Every attempt fails. The implication, elaborated later in the theory of the forms, is that one must retreat from the apparent and into the ideal in order to truly know anything (even if it means you might fall into a well). At their best, the long contemplative flights recorded in scientific and humanistic articles and monographs are every bit as exciting as anything in Jules Verne, and like much science fiction, they turn out to be true, or at least change the way I see things. Of course, it is not only the absent-minded who take these flights; anyone who has written a dissertation has done so, no matter how little liftoff they actually achieved. But the people I’m describing are, disproportionately, my intellectual cosmonauts. I should hate to see them grounded.
Can the absent-minded professor be saved? I do not know. I doubt it would occur to the truly absent-minded to ask. Instead, it would behoove you, Sondra, to keep one eye on the heavens, and one on the earth.
[1] Incidentally, Swift’s brutal satire remains as relevant as ever. Eight years is still about how long it takes to complete a doctoral program, and most of one’s time is indeed spent ruminating on the material, digesting it, and then trying to turn it back into something that doesn’t stink.
[2] You may have noticed, Sondra, that all of my examples of historical and fictional eccentrics have been men. I must admit that the mythic absent-minded professor is almost invariably male, and it wouldn’t take too much effort to whip up a stinging feminist critique of the type. To wit: 1) who, exactly, had to take care of these men’s worldly needs while they were peering up into the heavens? 2) how might the maleness of the myth predispose job committees, past and present, to prefer certain candidates? 3) how might the expectation of male eccentricity, closely allied to “genius,” funnel funding or other opportunities to men that might have gone to equally or more deserving women instead? and 4) oh, by the way, hasn’t supposed absent-mindedness excused a great deal of bad behavior, very much including unwanted sexual advances? All fair points! All uncomfortably true, and I’m sure you could add more. I can see why feminist iconoclasts would want to shatter the icons of this myth entirely. I wish they wouldn’t. All I can say, for now, is that I have encountered at least an equal number of male and female eccentrics in academia. A further defense would require at least another essay.