Still Here: Change and Persistence in the Place of the Liberal Arts in American Higher Education

Preface & Introduction

The following essay offers a brief history of the idea and practice of the liberal arts in America.  It is based upon a rich body of writings by historians of American higher education.  A contributor to that literature, I am here reliant mostly upon the work of others, some of them my teachers and colleagues, others known to me through their publications.  In discussing the recent past, I draw on six decades of personal experience as student, faculty member and erstwhile dean.  I welcomed the invitation from Mariët Westermann and Eugene Tobin of the Mellon Foundation to undertake this essay, but I have tried to do so in the way John Henry Newman commended “liberal knowledge” to the Catholic gentry of Dublin in The Idea of a University:  “That which stands on its own pretensions, which is independent of sequel, expects no complement, refuses to be informed (as it is called) by any end, or absorbed into any art, in order duly to present itself to our contemplation.”1

A prefatory word on definitions.  The term “liberal arts” is used here as a noun to identify a collectivity of academic subjects and as an adjective to identify a certain set of academic institutions.  Both are more readily defined by what they exclude.  Louis Menand recently defined “the liberal arts [and sciences]” as “subjects of disinterested inquiry rather than those of professional or vocational education.”  Art history is one of the liberal arts; accounting is not.  I mean by liberal arts institutions free-standing four-year colleges and the undergraduate programs of universities that so identify their curricular programs.  Amherst College and the College of the University of Chicago are liberal arts institutions; the Harvard Business School is not.  I offer here a clunky and historically contingent definition:  “Certain instructional matter privileged by certain parties as especially suitable for certain classes of students, yielding them and society certain lifelong educative benefits.”2

A prefatory word about approach.  I view the history of American institutions of higher education primarily in terms of the relative agency over time of their principal stakeholders to effect or resist change.  These include:  students, parents and alumni; faculty; presidents and trustees; donors and foundation officials; legislators and the general public.  Change is facilitated when proposed for a part of the institution over which the stakeholders proposing it have traditionally been ceded primary jurisdiction, and when they bring other stakeholders along.  For example, institutional finances are the province of the trustees and the president; the extra-curriculum, that of the students.  Change becomes contentious when proposed by stakeholders operating beyond their turf, as when donors dictate faculty appointments.  The turf under consideration here is the curriculum; the privileged stakeholders, the faculty.  Admittedly reductive, this approach is sufficiently alert to historical shifts in the relative power and purview of stakeholders within American higher education to keep in check inclinations to declare the sky is falling.  Historians make lousy catastrophists.3

1. The Cambridge-to-Cambridge Handoff

The intellectual roots of the liberal arts are to be found in the educational ideas of classical Greece (paideia) and Rome (Latin as the language of learning), in the medieval Catholic universities (the trivium and quadrivium), and in Renaissance humanism (with the rediscovery of the Greek classics).  England’s two original universities, Oxford and Cambridge, founded in the 12th and 13th centuries, fused these strains with the notion of a gentleman’s education and, following Henry VIII’s break with Rome, with the Reformed doctrine acquired during the Marian Exile in the 1550s.4

However complicated and protracted the intellectual origins of the liberal arts, their American provenance is certain.  They came ashore aboard the Arbella and the subsequent 54 passages that made up the Great Puritan Migration to Massachusetts between 1630 and 1646.  They were included in the intellectual baggage of the 134 graduates of Oxford and Cambridge who made the transatlantic passage and assumed leadership positions throughout New England.5

Most of these university graduates were Cambridge men, with a plurality (34) from Emmanuel College, a Puritan training ground founded in 1584 for the special purpose of providing the Anglican Church with a preaching ministry.  Emmanuel’s curriculum in the 1620s consisted of a four-year program of Latin composition, Greek, logic and philosophy.  Mathematics and science had yet to establish a significant curricular presence, the former seen as better suited to the training of merchants and mariners than of the prospective clergymen, statesmen and gentlemen who were Oxbridge’s stock in trade.  The order of instruction was determined by a student’s College tutor; its format consisted mostly of translation/pronunciation drills.  University oral examinations tested the degree of a candidate’s command of previously identified texts.6

The carriers of these notions of a gentleman’s proper education to New England were not disposed to modifying them.  As Samuel Eliot Morison wrote of the founders of Harvard:  “They had no new ideas on education.”  What Governor John Winthrop (Trinity College, Cambridge) and Cambridge’s minister Thomas Shepard (Emmanuel College, Cambridge) sought was a Puritan college safely placed in Cambridge, at equal remove from the surrounding wilderness, with its invitation to retreat into barbarism, and Boston, with its wily merchants and disruptive Antinomians.  Their anxiety is evident in New England’s First Fruits, America’s first fundraising brochure, likely written in 1643 by Hugh Peter (Trinity College, Cambridge), where founding a “college” ranked immediately after four survival basics (houses, livelihoods, churches and civil governments):  “One of the next things we longed for, and looked after was to advance Learning and perpetuate it to posterity; dreading to leave an illiterate Ministry to the Churches, when the present ministers shall lie in the dust.”7

Harvard’s founders assumed a sharp distinction between the instructional matter suitable for men slated for careers in the church or magistracy and that suitable for those foreclosed from university attendance for economic and class reasons (servants, renters, tradesmen), doctrinal deviance (Jews, Baptists, Quakers, Catholics) or gender (females).  The “liberal arts” suitable for “gentlemen” were not to be commingled with the more practical/vulgar instruction useful in securing a living in the trades.  Harvard was to provide New England with a sufficiency of educated leaders charged with keeping the rest of humanity orderly.8

Two centennial historians of Harvard have persuasively insisted that Harvard was not founded as a theological seminary.  Yet early graduates became ministers more frequently than they took up any other calling, and Hebrew in the curriculum speaks to early Harvard’s pulpit-filling purposes.  Graduates who did not enter the ministry were expected to become civic leaders and by and large did so.  Meanwhile, Harvard’s brief flirtation with an Indian College in the 1640s seems to have been motivated by fundraising concerns more than by any belief in higher education as an engine of social mobility.  A Harvard degree and the exposure to the liberal arts it attested confirmed one’s high social standing rather than providing a means of acquiring it.9

If Harvard’s early leaders had no new curricular ideas, they did effect an important departure in the matter of collegiate governance.  Emmanuel and the other colleges of Cambridge and Oxford were governed by unmarried graduates who had subsequently been elected Fellows and lived in the College.  When Harvard got underway in the late 1630s, the General Court of Massachusetts placed the governance of the College in the hands of two bodies, both composed of ministers and public officials uninvolved with the College on a day-to-day basis.  This “external governance” arrangement was seen as a temporary deviation from Oxbridge practice until the College acquired a sufficiency of mature faculty, but once in place it became permanent at Harvard and the norm for the American colleges that followed.  Lacking authority in such basic aspects of college life as finances and the appointment of presidents, faculty would learn to exercise it elsewhere.10

2. The Not Harvards

Fifty-seven years passed between the founding of Harvard and a second college in English America.  In 1693, at the urging of the Rev. James Blair, representative of the Bishop of London in Virginia, the House of Burgesses authorized the opening of the College of William and Mary in the colonial capital of Williamsburg.  Its intended purpose was to provide the increasingly unchurched Virginians with a domestic supply of Anglican clergy.  Once open, William and Mary quickly devolved into a grammar school.  Its most famous graduate, Thomas Jefferson, found little to commend in its standard curriculum, availing himself of his time there to prepare for the bar.11

The founding of Yale in Connecticut in 1701 was a more substantive undertaking.  The prime movers were clergymen, most of them Harvard graduates, concerned with providing the colony with a steady flow of ministerial talent, but unwilling any longer to rely upon their alma mater to do so.  Harvard in 1700 had already moved far enough from its Puritan foundations to be unacceptable to Connecticut’s Presbyterian leadership.  Unlike at William and Mary, most early Yale graduates did enter the ministry.  But like both Harvard and William and Mary, Yale achieved this without curricular innovation.  Its instructional fare consisted of a required regimen of Latin and Greek, some mathematics, and a little science, the last added after Harvard introduced it into its curriculum in 1715.12

If Harvard’s religious waffling begat Yale, Yale’s opposition to the Great Awakening begat four of the next six colleges making up the “Colonial Nine.”  The College of New Jersey/Princeton (1746), The College of Rhode Island/Brown (1765), Queens College/Rutgers (1766) and Dartmouth College (1769) were all founded by dissenting religious communities imbued with the evangelical message of the Great Awakening as spread throughout the English colonies by the itinerant George Whitefield and by the native son and Yale graduate Jonathan Edwards.13

Meanwhile, the founding of King’s College/Columbia (1754) and the College of Philadelphia/Penn (1755) sought to provide the more cosmopolitan Anglican and non-evangelical Presbyterian families of New York and Philadelphia a less “enthusiastic” collegiate environment than their country-based competition.  Neither college became a significant producer of clergymen, but each within a decade of its founding added a medical school to its offerings, indicative of professional appropriations to come.14

While none of these colleges were distinguished by curricular innovation, the first president of King’s College talked a good game.  Samuel Johnson, a Yale graduate and student of Locke and Newton, promised New Yorkers the following instruction:

in the arts of numbering and measuring, of surveying and navigation, of geography and history, of husbandry, commerce and government, and in the knowledge of all nature in the heavens above us, and in the air, water and earth around us, and the various kinds of meteors, stones, mines and minerals, plants and animals, and everything useful for the comfort, the convenience and elegance of life, in the chief manufactures relating to any of these things.15

Johnson honored his commitment to science in 1757 by hiring a protégé of Harvard’s John Winthrop, Daniel Treadwell, as King’s first professor of mathematics and science.  Unfortunately for the cause of science and a broadened curriculum, Treadwell died in 1760.  By then Johnson had tired of presidential duties and his ambitious curricular scheme went unimplemented. His successor in 1763, the 27-year-old Myles Cooper, settled for a curriculum modeled on his own collegiate experience in “polite letters” at Oxford.  When the 18-year-old Alexander Hamilton arrived on campus in the fall of 1774, he skipped the required classical fare in favor of science courses taught by the College’s medical faculty.16

The arrival in 1768 of the Rev. John Witherspoon from Glasgow enlivened the Princeton campus and broke its string of early-expiring presidents.  But it produced only a modest expansion of the standing curriculum with the introduction of the philosophical writings of the Scottish Enlightenment.17

Of the 830 graduates produced by the nine colonial colleges between 1769 and 1775, upwards of 300 became ministers.  For these graduates, college work in Latin, Greek and Hebrew had a professional function.  But for the majority of Harvard, Penn and Columbia graduates who went into law, medicine and business, the purposes of alma mater’s emphasis on ancient languages were ornamental and class-confirming.18

Ample evidence exists that some colonial collegians acquired a competency in the classical languages that they maintained in their subsequent lives.  The political pamphlets preceding the American Revolution and the ratification of the Constitution, embellished with learned references to and quotations from Republican Rome and Renaissance Florence, were mostly the work of college graduates.  In their post-presidential correspondence, John Adams (Harvard, 1755) and Thomas Jefferson (William and Mary, 1762) traded texts in Greek to advance their arguments.  The iconography celebrating independence (E PLURIBUS UNUM) similarly displayed the fruits of the classical training of the colonial colleges.19

3. The Ante Bellum College & Its Discontents

The successful outcome of the Revolution produced a rapid increase in the number and spread of colleges, while the nine antedating the Revolution, after selective rebranding to align names with the new political realities, all took up places in the cultural life of the Early American Republic.  Most of the 50-odd colleges in operation in 1800 received public support, either through legislative grants that helped underwrite their operations or, in the cases of the University of North Carolina and the University of Georgia, both chartered in the 1780s, as outright public institutions.20

The substantial public financing of so-called “private” colleges produced demands upon them to provide the new republic with an educated citizenry.  Those issuing these calls included the Philadelphia physician and Founding Father Benjamin Rush (Princeton, 1764), who in 1786 urged upon his fellow Pennsylvanians a broadened curriculum while taking a major role in the founding of Dickinson College.  Rush later proposed setting up a national university that would train officials of the federal government.  Similarly, the president of Bowdoin College, Joseph McKeen, observed in 1802, “We may safely assert that every man who has been aided by a public institution to acquire an education and to qualify himself for usefulness, is under peculiar obligations to exert his talents for the public good.”21

Except for the inclusion of civic considerations in the standard required course in moral philosophy, generally taught by the college president to seniors, these calls went largely unheeded.  Coursework in American history, for example, was not to become part of the standard collegiate curriculum until after the Civil War.  Any campus discussion of national events or contemporary literature was the purview of the extra-curriculum, where student members of debating societies and the more literary social fraternities held forth.22

A similar expectation attended the public support of private colleges:  that they offer practical instruction reflective of the “business-doing” inclinations of taxpayers.  In his campaign to establish the University of Virginia, Thomas Jefferson proposed a non-classical curriculum, replacing Greek and Latin with expanded instruction in science and modern European languages.  His wishes were ignored and a standard classical curriculum was installed.  The next serious move by a state institution to broaden the collegiate curriculum beyond the classical languages would not come for another four decades, when Henry P. Tappan became head of the newly opened University of Michigan in 1852.23

Reader alert:  You are about to enter contested terrain, at least for the three dozen of us American scholars who earn our keep as academic historians.  Columbia’s Richard Hofstadter, in The Development of Academic Freedom in the United States, set the ball rolling in 1955 by memorably dismissing the ante-bellum era in American higher education as “The Great Retrogression,” coming after the impressive achievements the colonial colleges made in advancing Enlightenment ideas and educating the Founding Fathers and before (for Hofstadter) the much-to-be-applauded emergence of the University after the Civil War.  Follow-ups by like-minded revisionist scholars highlighted the shortcomings of ante-bellum colleges, not least their denominational character and curricular conservatism.24

These criticisms in turn prompted other historians to defend ante-bellum colleges as caring communities focused on students and providing a rigorous curricular fare of intellectual and moral substance.  Only the most self-denying of these re-revisionists resisted quoting Williams College alumnus James Garfield’s 1871 definition of an ideal college:  “Mark Hopkins on one end of a log and a student on the other.”25

For all the discord, areas of agreement existed between the two camps.  Even if only half as many ante-bellum colleges failed as Hofstadter averred, most re-revisionists acknowledged that the Early American Republic had more colleges than popular demand justified.  Nearly all were under-enrolled and under-capitalized.  No ante-bellum college was secure enough to ignore the threat posed by a local competitor prepared to entice its students away with one inducement or another.  In the case of Yale in the 1820s, that meant not only Union College, already known for its poaching tendencies, but the recent openings of nearby Amherst College and Washington College (later Trinity) in Hartford.  But rather than devise innovations to distinguish their colleges from the rest of the pack, college presidents and boards hewed closely to the status quo.  In curricular terms this meant staying with the required classical curriculum, supplemented with the fewest electives needed for, as the 1828 Yale Report complacently asserted, “keeping up with the times.”26

It was this same Yale Report, published in response to a legislative inquiry questioning the practicality of a curriculum focused on “dead languages,” that became the principal defense of just such a curriculum for the next two generations.  (It still has its admirers.)  While one part of the report famously asserted the singular efficacy of Greek and Latin to produce an educated man by providing him with both “the furniture and the discipline of the mind,” other parts implied that the genius of a required classical curriculum was less intellectual or pedagogical than operational.  A required classical curriculum could be installed quickly and cheaply, and then maintained with little overhead.  It required no library.  The fact that a given class moved line-by-line through an assigned text in Greek and Latin meant that few texts were required.  Instruction dispensed to classes of known size in a sequential fashion made the curriculum easy to administer.  Students went along with these arrangements as long as they were not subjected to rigorous grading and retained their unity as a class.27

A classical curriculum posed few staffing problems.  Recent graduates could be hired as short-term tutors.  Older graduates who had entered the ministry but found pastoral duties unedifying constituted another source of recruits.  If such men took to the work, they might be kept on and in time promoted to one of the college’s professorships.  Advanced training in a given subject would be a plus, but could hardly be required when tutors were expected to provide instruction across the curriculum.28

For a generation after its publication in 1828, the Yale Report provided cover for poorer, smaller colleges with less access to specially trained faculty or library facilities than was to be found in New Haven.  Having publicly championed the virtues of the required classical curriculum, and become nationally identified with it, Yale and the dozens of its graduates who had taken up positions as college presidents and professors were saddled with it.  As one Harvard observer noted presciently in 1869:  “The inertia of a massive University is formidable.  A good past is positively dangerous, if it makes us content with the present and unprepared for the future.”29

Meanwhile, leaders of newly opening colleges — Transylvania (1780), Williams (1793) and Middlebury (1800) — expressed curricular ambitions that included fuller coverage of science and modern European languages.  Of the dozens of colleges founded in the four decades after independence, only Union College (1795) under the leadership of Eliphalet Nott (1805–61) carried through with a two-track curriculum that offered a concentration in the sciences and mathematics alongside the standard classical curriculum.  Impressed by the need for Americans trained in engineering, demonstrated by the about-to-be-completed Erie Canal, and by the failure of New York’s colleges to meet that need, the businessmen founders of Rensselaer Polytechnic School in Troy, New York, in 1824 went still further, offering a curriculum given over to the “application of science to the common purposes of life.”30

In 1831 a group of New York merchants founded the University of the City of New York (later NYU), promising a utilitarian curriculum to “correspond with the practical spirit of the age.”  Columbia College trustees promptly announced a new scientific and literary curriculum as a complement to its traditional classical curriculum.  Fortunately for Columbia, NYU quickly got itself into financial straits and ceased to be the competitive threat it had initially posed.  Just as well, because Columbia’s five-person faculty saw the new scientific curriculum, which consisted of a few additional courses in mathematics, rhetoric for juniors and outside tutoring in modern European languages, as an added teaching burden and set about subverting it.  Students showed little interest in the new curriculum and resented its impact on class unity.  It was quietly discontinued in 1843.  Four years later, when the Free Academy of the City of New York opened as a municipally funded, tuition-free alternative to Columbia and NYU, both sets of trustees decried the use of public funds to educate the City’s lower classes.31

Efforts at curricular reform mounted at ante-bellum Harvard and Brown by strong-willed presidents in the 1840s testify to the staying power of the classical curriculum.  On accepting the Harvard presidency in 1829, Josiah Quincy inherited a course of studies that had changed little since his student days four decades earlier.  The freshman year consisted of 18 hours a week of recitations in Greek, Latin and mathematics.  At his inauguration the 59-year-old unrepentant Federalist promised only to “effect a more thorough education in the Greek and Latin languages.”32

What turned Quincy into a curricular reformer was the need to bring Harvard’s raucous student body to order.  (His predecessor’s tenure had been ended by a student rebellion.)  His solution was to undercut the power classes exercised collectively over their tutors by reducing the number of required classes and increasing the non-prescribed (i.e., elective) courses available to students as early as the sophomore year.  The Corporation worried about the added expense of the expanded curriculum and the risks of premature innovation, while students complained about the assault on class solidarity.  But at least two faculty members — the soon-to-depart belletrist George Ticknor and the brilliant young mathematician Benjamin Peirce — welcomed the reform as reducing their dealings with those Peirce called “the very dull sons of rich parents.”33

By 1841, all Harvard students beyond their freshman year elected their course of study, allowing Quincy on his retirement in 1845 to declare that Harvard was becoming a university in the European sense.  Three years later, his successor Edward Everett reinstalled the required curriculum for the sophomore class and extended required courses into the junior year.  The next three Harvard presidents accepted the status quo ante Quincy.  When Charles William Eliot became president in 1869, Harvard students were subject to a curriculum immediately recognizable to their grandfathers.34

The curricular reforms of Brown’s President Francis Wayland (1827–1855) were equally short-lived.  Both in his Thoughts on the Present Collegiate System (1842) and his Report to the Corporation of Brown University (1850), he made the case for Brown moving away from the curriculum common “in all the northern colleges,” but he also noted the risks an under-enrolled college took in doing so.  And if Brown, after eight decades of operations and located in a thriving industrial city, could not risk waiting for a clientele for a non-classical curriculum to materialize, what of newer and poorer country colleges?35

The difficulties inherent in toppling the classical curriculum convinced two reform-minded businessmen in 1847 to attempt an end run by calling upon Harvard and Yale to establish separate schools of applied science and technology.  The Massachusetts textile manufacturer Abbott Lawrence did so with a gift to Harvard of $50,000 for the Lawrence Scientific School, while the Connecticut manufacturer Joseph Sheffield did so with a series of benefactions to Yale for what became the Sheffield Scientific School.  The gifts allowed both colleges to make new appointments in emerging sciences, but otherwise had little impact on students attending Harvard or Yale proper.36

The final example of an ante-bellum college stubbornly adhering to its classical curriculum is Columbia College.  Despite the College’s location in the nation’s largest and fastest-growing city, its enrollments had not measurably increased between 1820 and 1854 and never exceeded 100.  During that entire period all instruction in the sciences was provided by a single member of its five-person faculty, the autodidact James Renwick, Professor of Natural Philosophy.  His teaching repertoire included chemistry, geology, physics, canal-building and steam engines.  (He was also a student of the history of science.)  When a group of reform-minded Columbia trustees sought to use the College’s growing revenues from Manhattan real estate holdings to upgrade its scientific offerings, they forced Renwick’s retirement and tried to replace him with the German-trained chemist Wolcott Gibbs.  His sponsors in the emergent American scientific community insisted that Gibbs represented Columbia’s best chance to shake its reputation as a sleepy place for the sons of the City’s Episcopalian gentry where only the classics were passably well taught.  When a majority of trustees could not be mustered in support of Gibbs, they settled for an undistinguished alternative and three more decades of curricular stasis.37

4. The University Turn

Two institutional types took the lead in transforming American higher education in the generation after the Civil War:  already established colleges that adapted to the reordering underway by becoming universities, and new institutions that set out to be universities de novo.  The first type included Harvard, followed by Columbia, and later by the more ambivalent Yale, Penn and Princeton.  The second type included Cornell and then, more definitively, the Johns Hopkins University, followed by Stanford and the University of Chicago.38

One harbinger of change occurred in 1861 when Yale introduced the PhD degree into American usage by awarding it to three graduates who had continued formal studies in New Haven and submitted dissertations.  One reason Yale authorities gave for doing so was to provide their graduates with a domestic alternative to extended post-graduate study abroad.  Between 1817 and 1860 some 500 Americans had made academic pilgrimages to the University of Göttingen, Berlin or Heidelberg for graduate studies, returning home with the honorific but occupationally vague title of “Doctor of Philosophy.”39

In 1862 Congress passed the Morrill Land Grant Act, which made grants of federal lands to the states of the Union to establish new institutions, or support existing ones, for the following purpose:

to teach such branches of learning as are related to agriculture and the mechanic arts, in such manner as the legislatures of the States may respectively prescribe, in order to promote the liberal and practical education of the industrial classes in the several pursuits and professions in life.

While explicitly providing for classical and professional studies, the law’s intent was to encourage the kinds of practical instruction that the nation’s private colleges, with few exceptions, had thus far declined to provide.  Western states with universities where the curricular fare mimicked that of the private colleges put their federal money into new schools of agriculture and applied technology.  Massachusetts used its grant on the newly opened Massachusetts Institute of Technology.  The New York legislature put its money on the about-to-open Cornell University, where its founders were promising that “any person can find instruction in any study.”40

Still, too much can be made of these harbingers.  Yale’s domestication of the PhD did not produce an immediate stampede to New Haven.  Over the next decade Yale awarded only another dozen PhDs.  Cornell awarded its first PhD in 1872.  Harvard followed a year later and Columbia a year after that.  By 1875, these four universities had awarded a grand total of fifteen PhDs.  More was needed than an imported credential and federal legislation to kick-start the academic-revolution-to-come.41

Needed was sustained leadership of a personal and institutional sort.  The first appeared in 1869 in the person of the incoming 35-year-old president of Harvard University, Charles W. Eliot.  He came at a propitious time.  Harvard’s finances had improved over the previous two decades, thanks to Boston’s Brahmin class adopting the College as their philanthropy of choice, making it, along with Columbia, one of America’s wealthiest colleges.  By 1869 Harvard’s enrollments had caught up with Yale’s.  But there was also something present in Cambridge absent in New York or New Haven:  a pervasive sense among those in charge (the seven-member Corporation) of the need, after a quarter-century of four ineffective presidencies, for aggressive administrative leadership.42

Eliot did not disappoint.  His inaugural address laid out an ambitious agenda:  to make student choice and faculty specialization the twin hallmarks of his Harvard.  To advance the first, he was prepared to go beyond Quincy by eliminating altogether the requirement that a Harvard student’s academic program include Greek and Latin.  He also pushed to have elective subjects available at every point in a student’s four years.  That such open-ended arrangements made for organizational complexity and added expense, and invited charges of dilettantism from competitors, Eliot readily acknowledged.  He simply believed the complexity manageable, the expense bearable, the competition functional, and the criticism wrongheaded.  He was also persuaded that a system where students had a say in their programs would be a boon to recruitment and student order.  When challenged to explain why such fundamental changes were underway, he likely offered a variant of the explanation given to his medical faculty when they questioned changes he prescribed for them: “Because there is a new president.”43 

Where Eliot’s inaugural certainty flagged was on the question of where Harvard would find the men to teach his expanded curriculum.  That is surprising, given his own experience as an assistant professor of chemistry of local standing who had been beaten out for a permanent position by an outsider of professional standing.  In the event, Eliot quickly found faculty allies to advance his elective system, some already at Harvard, others ready to come.44

The opening of the Johns Hopkins University in 1876 made clear that Eliot’s problem would not be finding men capable of taking up faculty positions in Cambridge, but, once there, keeping them from defecting to Baltimore.  What Hopkins offered ambitious scholars was an institution focused on graduate studies and possessed of fellowship funding sufficient to attract the brightest American college graduates to undertake such studies.  A single successful raid by Hopkins on a promising young German-trained classicist and President Daniel Coit Gilman’s active recruitment of the psychologist William James put Harvard’s president on full alert.45

By the mid-1880s, Eliot was able to draw upon a deepening pool of American-trained PhDs, some with Cambridge credentials but others by way of Baltimore. “Many of these young professors,” the atypical Harvard PhD George Santayana noted,

are no longer the sort of persons that might as well have been clergymen or schoolmasters:  they have rather the type of mind of a doctor, an engineer, or a social reformer; the wide awake young man who can do most things better than old people, and who knows it.46

The 575 PhDs Hopkins produced between 1878 and 1900, along with the more than 1,000 produced by Yale, Harvard, Columbia, and Cornell, transformed American higher education.  At these universities and at free-standing colleges alike, PhD-holders became the faculty norm.  At Oberlin, well into the 1880s, the College drew its faculty from its own graduates, who taught a fixed curriculum of classical languages and evangelical apologetics.  By 1901, according to historian John Barnard, the College “had assumed the obligation of placing only professionally qualified teachers in its classrooms” and “the fixed curriculum of the liberal arts had almost vanished.”47

Once these first “wide-awake young men” — with their own professional identities specified by their degrees — were in place, they secured additional places for other PhDs in those academic specialties that possessed a sufficiency of PhDs for staffing purposes, a learned society to advance their cause, and a professional journal in which to publish their research.  Collectively these disciplines became the many-roomed mansion of “the arts and sciences,” or, as some still preferred, “the liberal arts.”48

Included in the liberal arts were the social sciences.  Prior to the 1880s, issues such as poverty, crime, immigration, labor, capital and urban planning were the purview of civic-minded businessmen, clergymen, legislators and social workers.  Foreign relations were the province of international lawyers and missionaries.  By the early 1900s, all these traditional competences were being challenged by newly credentialed academics, with some of them, like Columbia’s E. R. A. Seligman in economics or Princeton’s Woodrow Wilson in political science, becoming nationally accepted authorities in their fields.  Anthropologists like Franz Boas came forward to challenge popular views on race and primitivism, while the historian Charles Beard set out to deprive the Constitution of its sacred status.  In the process, the social concerns that gave the Progressive Era its name became embedded in the curricula of America’s universities and liberal arts colleges.  And so they remain.49

Meanwhile, the sciences proceeded with their own turn-of-the-century academicization.  Although a smaller percentage of early science PhDs took up faculty positions than those in the humanities and social sciences, those who did had an equally high estimate of their standing and were equally alert to their responsibilities for boundary management, in their case warning off “applied scientists” and those focused on “the industrial side of knowledge and instruction.”  When Columbia underwent its institutional reorganization in the early 1890s, prior to changing its name from Columbia College to Columbia University, its chemists, physicists and geologists took for themselves the exalted name of “The Faculty of Pure Science,” leaving Columbia’s engineers to find a lesser place in the University’s distinctly down-market School of Mines.50

Presidents responsible for the wellbeing of four-year colleges responded to this threat from the large universities by introducing more choice into their own curricular offerings.  An early step was to add an alternative track to the AB that did not require the study of Greek or Latin beyond the freshman year.  In their place came elective courses in English literature, European and American history, political economy and sociology, along with more science and more stress on modern European languages.  While never quite abandoning the idea that the college should set the curriculum, or at least impose some requirements, presidents in practice ceded much of the actual choosing to their students.  Game, set, match, Eliot.51

5. What Counter Revolution?

The rise of the elective system and the academic professionalization it engendered did not go unchallenged.  In 1870, just months after Eliot’s inauguration, Yale’s President Noah Porter railed against unnamed advocates of revolutionary doctrines; fifteen years later, Princeton’s James McCosh took on Eliot personally while defending his inherited fixed curriculum.  Even as their faculties were increasingly staffed by university-trained disciplinary specialists, late 19th-century college presidents took to justifying the continued existence of their institutions as singularly suited to the production of “the whole man.”  This they did by extolling the life-enhancing values derived from residential campus life, from fraternities and athletics, from class spirit — from just about anything except the curriculum, whose control they quietly ceded to their faculty.  Amherst’s President George Harris put the case for four-year colleges in 1905:

The aim of the college is not to make scholars.  The aim is to make broad, cultivated men, physically sound, intellectually awake, socially refined and gentlemanly, with appreciation of art, music, literature, and with sane, simple religion all in proportion; not athletes simply, not scholars simply, not dilettantes, not society men, but all-round men.

As for the undergraduates at these colleges, historian and campus observer Frederick Rudolph has wryly acknowledged, “no student ever confused the curriculum with college.”52

The real threat that free-standing four-year colleges faced in the late 19th century was the prospect of being simultaneously squeezed from below by the curricular enhancement of the newly opened private preparatory schools and the proliferating urban public high schools, and from above by professional schools accepting applicants with little or no college training.  Such a development posed little threat to undergraduate programs embedded within universities with their array of professional options.  Columbia’s President Barnard went so far in 1869 as to urge his trustees to leave undergraduates entirely to the “country colleges,” and have urban-based Columbia specialize in professional and graduate studies.  In 1875 President Daniel Coit Gilman only reluctantly added an undergraduate component to Hopkins when local backers insisted.  In 1884 Columbia professor of political science John W. Burgess, who had left Amherst five years earlier when its trustees rejected his plans for Amherst becoming a university “after the German model,” predicted the demise of the four-year college.  In 1902, Stanford University president David Starr Jordan characterized the liberal arts college as “antiquated, belated, arrested, starved, as the case may be . . .  As time goes on, the college will disappear in fact, if not in name.”53

In the event, it was not Eliot’s thoroughgoing elective system that became the curricular norm.  Nor did it long survive his retirement in 1909, after which it was replaced by a faculty-negotiated distribution/major system wherein students were required to choose from a set of introductory courses in designated fields — the sciences, the social sciences and the humanities — before settling on a disciplinary major.  By the 1920s this had become the industry standard, adopted by university colleges and free-standing colleges alike.54

Two technical changes in the way colleges recruited students sped the late-19th-century transformation from a classical and largely prescribed curriculum to a non-classical, distribution/major curriculum.  Through most of the 19th century, colleges came by their students locally.  Only Yale and Princeton had relatively national student bodies, whereas Harvard and Columbia drew most of their students from eastern Massachusetts and New York City respectively.  It was only in the 1870s that Harvard realized, as Henry Adams was told by a student, “The degree of Harvard College is worth money in Chicago.”55

So long as most applicants lived nearby, colleges and universities could require that students come to campus to sit for an oral admissions examination administered by members of the faculty.  But when some institutions set out to attract a national clientele, this admissions requirement became a problem.  The solution that Columbia’s enterprising Nicholas Murray Butler came up with in 1898 was a nationally administered set of subject-specific admissions exams developed by what came to be called the College Entrance Examination Board.  Once these exams were widely adopted, they became not only a force for curricular uniformity among preparatory schools but also a mechanism allowing colleges to accept alternatives to the earlier admissions standards requiring preparation in Latin and Greek.  They also had the more problematic long-term effect of providing colleges with a predominantly homogeneous applicant pool.  Harvard signed on in 1903 and other early holdouts followed.56

A second late-19th-century development was the proliferation of public high schools, especially in the nation’s largest cities.  New York City’s first public high school opened in 1887; by 1902 the consolidated City had 18 public high schools in operation, three of them for girls.  If local colleges were to compete for a share of these publicly-prepared graduates, they had two choices:  to persuade public schools to offer Greek and Latin to their college-bound students, or to accept other academic evidence of college-readiness.  The latter was easier.  In 1896 Columbia and Barnard dropped Greek from their admission requirements, allowing applicants to substitute competency in either German or French.  In 1900 Latin became one of three languages acceptable for admissions.  The dropping of Greek and Latin from a shrinking list of college prerequisites, the substitution of a modern language as evidence of college-readiness, the adoption of nationally administered College Boards for admissions purposes, and the clustering of expanded course offerings by fields and, within them, by departments — with “arts and sciences” largely displacing “the liberal arts” as the collective term of choice — were all turn-of-the-century departures from earlier arrangements, adopted by most colleges and adhered to since.57

The closest presidents came to a distinctive curriculum-related claim for four-year colleges was to cite the single-minded (indeed, sacrificial) commitment of their faculty to classroom teaching.  The contrast with the publications-obsessed university-based faculty was not always explicit but never absent.  College faculty, the argument went, conveyed their love for the liberal arts directly to their students, whereas university-based faculty directed their specialized knowledge of their subject to disciplinary peers.  Charles J. Sykes has provided us with a description from his student days of a campus legend, Professor Lewis D. Stilwell, a member of the Dartmouth History Department from 1916 to 1952:  “When a former student once urged Stilwell to try to publish his lectures, Stilwell expressed little interest.  Maybe he would someday, he said, ’after I have retired for a while. Right now I am working on a new lecture I think the guys will enjoy.’”58

A passing participant-observer of the pre-WW I Amherst College faculty identified three types:  “Old Giants,” “Middle Lazies” and “Young, Soon to be Gones.”  The last included those hoping to move to a university, where their commitment to specialized scholarship would be less suspect and where lighter teaching loads would make producing such scholarship possible.  A later Swarthmore College self-study put the distinction neatly:  “Is he a member of the Swarthmore faculty and incidentally a historian, or is he a historian who happens to be at Swarthmore College?”  It mattered in 1937 to President Henry Wriston of Lawrence College (now university) that any member of his history department “should be able to give any of the undergraduate courses in history, most of the courses in government, and with a special effort, one or two in economics or sociology.”59

But what, if any, incentive did university faculty have to teach outside their specialty?  Most saw their roles — producing scholarship, and the next generation of scholars — as limited to their particular sub-fields, leaving responsibility for the general instruction of undergraduates to colleagues who, either by designation or by choice, relieved their seniors and betters from having to provide it.

Two instances of this emergent faculty division-of-labor at universities may suffice.  In 1900–1901, with Eliot’s elective system fully in place, Harvard’s 2,000 undergraduates each took four year-long courses.  They had 411 courses to choose from, but half of all their enrollments were in just 11 courses (most with 200-plus students).  These mega-classes were taught by professors known for their performing skills (and some for their lenient grading) with the assistance of small armies of section men.  This left the other 140 faculty with 400 courses to accommodate the remaining 4,000 enrollments, for an average of ten students per course.  The median was even lower, with many classes having only one or two enrollments.  The eminent medieval historian Charles H. Haskins met his teaching obligations in 1902–03 with three classes, respectively enrolling one, four and six students.60

“Eliot’s sleight of hand appeared to liberate the students,” another Harvard historian, Oscar Handlin, has written.  “It gave them a choice few actually exercised, to range far and wide in an expanding curriculum.”  But it did more.  “Reform freed the faculty for scholarship.  Relieved of the chore of hearing recitations, those who wished could devote themselves to learning, with the instruction they offered a spin-off from their own thinking.”61

Similar results obtained at turn-of-the-century Columbia by different means.  The undergraduate curriculum inherited by President Seth Low in 1890 remained the traditional one, long on language instruction and short on electives.  And while some professors, notably those attached to the School of Political Science, had limited their teaching responsibilities to juniors and seniors, most other Columbia faculty were still expected to conduct freshman and sophomore recitations in required subjects.  But rather than expand the curriculum to allow more student choice and let his increasingly research-focused faculty teach undergraduate courses of their choosing — the Eliot solution — Low introduced a structural reform that exempted many faculty from undergraduate teaching altogether.62

Prior to 1890 the Columbia teaching staff was organized by schools, including the semi-proprietary law school, the School of Mines, and the recently re-integrated medical school (The College of Physicians & Surgeons).  Faculty with primary responsibility for undergraduate instruction were organized as the School of Arts, while the School of Political Science, founded in 1881, was the forerunner of what became in the early 1890s the University’s three Graduate Faculties: Political Science, Philosophy, and Pure Science.  Under Low, a businessman and ex-mayor of Brooklyn who lacked academic credentials beyond his Columbia AB, the departments of the Graduate Faculties assumed control over faculty appointments.  College faculty membership was at first for a stipulated term, but over time these assignments fell to a department’s youngest members or to the few senior members who preferred teaching undergraduates.  To be a permanent member of the Columbia College faculty had its own pleasures, but it left one’s scholarly bona fides in question.63

6. A Note on Early Collegiate Women and Women’s Colleges

Georgia Female College and Oberlin College, the former generally regarded as the first college for women and the latter the first co-educational college, both got underway in the late 1830s.  Mount Holyoke opened in 1837, as a female seminary rather than a college.  They were followed in the 1840s and 1850s by a handful of state colleges that opened their doors to women, in some cases to secure a sufficiency of students.  Most of these ante-bellum undertakings offered women instruction in the domestic arts rather than the classical curricula of the men’s colleges.  The curriculum of early Mount Holyoke prepared women for careers in the foreign missions.  Both models acknowledged, as my colleague Rosalind Rosenberg has shown, the coercive force of the prevailing social imperative of “separate spheres.”64

The first women’s college to offer its students a curriculum approximating that of the leading men’s colleges was Vassar College, which received its charter in 1861.  The college’s early offerings in science exceeded those of many of its male peers.  The oldest of what in the 1920s came to be “The Seven Sisters,” Vassar was followed by Smith College (1875) and Wellesley (1875), the Harvard Annex (later, Radcliffe, 1879), Bryn Mawr (1885), a re-chartered Mount Holyoke (1888), and Barnard (1889).64

Bryn Mawr explicitly set out to be “the female Hopkins,” even to the point of establishing a graduate school, and Barnard’s initial curriculum was a carbon copy of Columbia’s.  By the turn of the century, all seven sisters had curricula interchangeable with those of the leading men’s colleges and faculties drawn from the same ranks of university-certified PhDs, which by 1910 included a fair number of women.  It was Bryn Mawr’s president, M. Carey Thomas, who drew the ire of William James — and prompted his uncharacteristically shrill “The PhD Octopus” — for firing one of his best but procrastinating male graduate students with the dictum:  “No PhD, no job.”  Curricular emulation of their male peers was the watchword of the Sister colleges, not radical curricular experimentation.65

7. Interwar Tweakings of the Liberal Arts

The 1920s occasioned a rapid increase in the numbers of American undergraduates, from 600,000 at the start of the decade to 1,100,000 by 1930.  The Depression and WW II slowed the growth, but not before college-going, once the exclusive option of the children of America’s economically comfortable, had become accessible to the economically aspiring.  Most of the growth in American higher education during the interwar years occurred at the base of the standing academic order, at public full-service universities, at newer satellite campuses and at junior or community colleges.  Most offered vocationally-directed courses in addition to some basic instruction in liberal arts subjects under the rubric of “general education.”66

Meanwhile, Ivy universities and leading liberal arts colleges capped their enrollments, many to attract a more national student body, some by systematically rejecting otherwise academically qualified students because of religious or ethnic background or propinquity.  At Columbia and Barnard, with Jewish students making up 20% of entering classes, personal interviews of local applicants and a psychological test provided the necessary filters.  Those rejected found places at CCNY, NYU or Seth Low College in Brooklyn, from which a successful student might transfer to one of Columbia’s professional schools after two years, but not to Columbia College.  At interwar Princeton, where Jews and Catholics made up less than 20% of its entering classes, such reassuring admissions statistics appeared regularly in the Princeton Alumni Weekly.  Amherst and Bryn Mawr posted similar numbers.67

The upshot, almost certainly, was a decline in the market share of the liberal arts in the four-year sector of the interwar American academic economy.  Recent retrospective estimates put the percentage of all collegiate degrees in 1940 awarded in the liberal arts at 40%, with humanities degrees accounting for less than 10%.  In absolute terms, however, the number of liberal arts graduates between 1920 and 1940 likely tripled.  As Roger Geiger has advised, in assessing the health of a sector of the academic universe, “numbers are more important than market share.”68

The interwar period witnessed several well-publicized curricular reform initiatives which will be briefly considered here.  It was at Columbia, the nation’s largest university in the interwar period, where the insulation of senior faculty from undergraduate instruction had proceeded the furthest.  It was also where undergraduates regularly cut short their collegiate careers by one or two years by transferring to one of the University’s professional schools.  But Columbia was also the site of a modest counterattack against both these practices.  It began in 1915 when Associate Professor of English John Erskine (Columbia College 1899; CU PhD 1903), just back from seven years at Amherst, proposed a two-year reading course for selected juniors in Columbia College on “The Great Books.”  Erskine’s proposal provoked the usual criticisms:  from the University’s classicists, that The Iliad could not be understood except in the original Greek; from the College dean, that the course opened Columbia up to the charge of “dilettantism.”69

When the United States entered the Great War, Erskine left Morningside Heights for wartime service in France.  Meanwhile, Columbia mounted a one-semester course in “War Aims.”  Designed for the Student Army Training Corps as an exercise in Allied apologetics, it was required of all Columbia students.  When the Armistice eliminated the rationale for such a course, faculty backers in the social sciences proposed a course on “Peace Aims.”  By the fall of 1919 its title had been changed to “Contemporary Civilization,” its purview the troubled history of Europe since 1871.  Throughout the early 1920s, “CC” was a year-long lecture course required of freshmen and taught by members of the College faculty with textbooks written by members of the Graduate Faculty of Political Science.  In 1928 a second year, “CC-B” was added as a sophomore requirement, covering American social and political history since 1870.70

Meanwhile, approval was granted for Erskine’s now year-long “Honors Seminar on Important Books.”  Limited to 20 handpicked juniors, it met weekly and adhered to a discussion format.  By 1925 the course had expanded to five sections, with Erskine’s section jointly taught with his ex-student Mortimer Adler.  For all the loyalty these seminars engendered among the involved faculty and Columbia’s more bookish undergraduates, Erskine’s “great books” initiative elicited little support from departmental colleagues.  When Erskine resigned from Columbia in 1927, his seminar was discontinued.71

The onset of the Great Depression occasioned changes in the core’s scope, size and rationale.  In 1930 the starting date of “CC-A” was pushed back to 1300 to accommodate the Middle Ages, the classes became discussion-sized (20–24 students per section), and the readings became focused on documents.  By 1941, more than half of a Columbia undergraduate’s first two years consisted of required extra-departmental (inter-disciplinary, if you wish) coursework, all of it focused on the high cultural movements of western civilization.  The sciences went unconsidered in the core, and science faculty played no role in its staffing.72

Columbia’s interwar move to a prescribed curriculum was prompted by more than pedagogical or cultural considerations.  Demographic and economic factors also played a part.  Unlike many other university heads during the Depression, President Butler resisted substantial cuts in his faculties.  But a drop in graduate registrations and a cap on undergraduates (this to limit the numbers of otherwise academically qualified Jewish public high schoolers), led departments to see participation in the College’s core curriculum as justifying otherwise redundant faculty.  Staffing the core became the academic equivalent of a New Deal public works project.73

By the early 1930s two of “Erskine’s Bookies,” the philosophers Mortimer Adler and Richard McKeon, had decamped from Columbia to the University of Chicago, taking the “Great Books” brand with them.  There they joined President Robert M. Hutchins’ campaign, already underway, to up-end the University’s heretofore neglected undergraduate program.  The curriculum Hutchins inherited in 1929 was already under faculty-directed revision and set for implementation in 1931.  This “New Plan” consisted of required year-long introductory courses in each of five “knowledge areas” — biological sciences; English composition; humanities; physical sciences; social sciences — to be taught by faculty assigned to the undergraduate division.  These two years of “general education” were to be followed by two more years of advanced undergraduate coursework in one of these five areas, with instruction from the relevant graduate division.74

Initially supportive of the “New Plan,” Hutchins, at the urging of Adler, took to faulting its lack of directedness and its emphasis on facts rather than ideas.  He and Adler set about replacing the “New Plan” with their own, which a critic later characterized as “the cultivation of intellectuality for its own sake.”  Hutchins, trained in the law and lacking a PhD, and Adler, who never held a regular appointment at Chicago, led the way with a two-year undergraduate seminar on “Classics of Western Culture.”75

The Hutchins scheme was well received by the national press, but faculty response was more negative, especially among Chicago’s scientists, who saw it as privileging the humanities over empirical inquiry.  Hutchins also faced opposition from his History Department, which he faulted for excessive specialization:

Now, if the professor of American history gets sick, the professor of English history cannot take his work. And in a university, if the professor of American history from 1860 to 1864 gets sick, the professor of American history from 1865 to 1870 cannot take his work.76

The department responded by moving from the humanities division to the social sciences division. 

Hutchins stayed on at Chicago until 1950, but his last victory, the creation of a free-standing undergraduate faculty in 1942, was promptly dismantled upon his departure.  By then the geographical locus of the Great Books approach to undergraduate education had moved yet again, now to St. John’s College, in Annapolis, Maryland, where Adler and McKeon became frequent visitors.77

A third instance of interwar curricular reform gone awry involved Alexander Meiklejohn, president of Amherst College (1912–23) and author of The Liberal College (1920).  In 1926, after running afoul of the Amherst board, Meiklejohn went to the University of Wisconsin with the express purpose of designing an Experimental College, which he then directed for six years.  Students in the College lived together in a university dormitory and had no required classes or exams.  They spent their first year studying Periclean Athens and their second exploring contemporary American society.  The experiment attracted wide notice, placing Meiklejohn on the cover of Time magazine in 1928, but also mobilized opposition among traditionalist faculty and administrators.  With the onset of the Depression, enrollments fell; Meiklejohn left Wisconsin and turned his energies to adult education.  His account of his work in Madison, The Experimental College (1932), is a classic statement on behalf of a core curriculum and a criticism of lecture- and discipline-based higher education.  It is also a cautionary tale attesting to the limits of reform.78

Three general points about these interwar efforts at undergraduate curricular reform:

  1. They all occurred at large full-service universities, where shortening the typical undergraduate education from four to three or even two years before proceeding to professional studies posed little financial threat;
  2. They were promoted by reformers at odds with the prevailing research ethos of these universities, and opposed by most senior faculty, who faulted their curricular ideas as dilettantish and superficial;
  3. They all slighted the sciences, and in the case of Hutchins-Adler-McKeon at Chicago, the social sciences.

One interwar curricular initiative did have its origins at a free-standing college:  the honors program identified with President Frank Aydelotte of Swarthmore College.  It consisted (and consists) of an honors track, in which a select number of juniors and seniors took two seminars a semester, instead of the normal regimen of four or five courses, with both seminars taught by senior faculty in the humanities and social sciences.  The honors track was intended to serve two functions:  to allow some of Swarthmore’s best students a preview of graduate school, and to provide faculty with an instructional format that allowed them to move beyond the basics of their discipline, to draw upon their own ongoing research, and to serve as mentors.79

While distinguishing Swarthmore among American colleges, its honors program had few imitators.  [Fortunately for me, the University of Rochester was one of them.]  A principled objection often made was that honors tracks are necessarily selective and thus undemocratic.  A more practical consideration is cost, which even the well-endowed interwar Swarthmore needed help from the Rockefeller-funded General Education Board to meet.80

A final and collective comment here about the state of the liberal arts in the interwar period relates to the American South, where three different kinds of institutions of higher education paid increased curricular attention to the humanities and social sciences.  At established southern universities such as the University of North Carolina and Vanderbilt University, faculty in the 1920s and 1930s took upon themselves a comprehensive examination of the South’s cultural heritage.  At Chapel Hill this led to the creation of undergraduate programs in folklore, rural economics, regional sociology and the history of the South, whereas in Nashville curricular attention turned more to the humanities, where the study of Southern letters, past and present, became an institutional imperative.81

Meanwhile, at many of the South’s black colleges, the vocational curricula long favored by Booker T. Washington and his early patrons gave way to the liberal arts curriculum favored by W.E.B. Du Bois as better suited to preparing the race’s “talented tenth” for professional careers.  This shift was accompanied by a greater willingness on the part of black faculty to follow Du Bois’s lead in the study of race and racism in America, which in turn strengthened the claims of history and sociology to places in the curriculum of black colleges.82

And then there was the short-lived experimental case of Black Mountain College, founded outside Asheville, North Carolina in 1933, and committed to the belief that the performing and creative arts were an essential part of a liberal arts education.  By the time of Black Mountain’s closing in 1957, the idea that the performing and creative arts were too important to be left exclusively to conservatories and art institutes had traction elsewhere, not least at women’s colleges such as Barnard, Bennington and Sarah Lawrence.  Together these developments constituted both a regional renaissance for the liberal arts in the midst of the Great Depression and a foreshadowing of the more national efflorescence of the liberal arts in the brief postwar “golden age” that followed. 83   

8. The Liberal Arts in the “Golden Age of the American University”

The end of World War Two ushered in two decades of unprecedented prosperity for institutions of higher education.  Several factors helped make it so, starting with the pent-up demand for higher education among the millions of young men in the military services produced by six years of a military draft and four years of war.  In 1944 Congress devised a singularly generous means of meeting that demand — in hopes of avoiding a return to massive unemployment — with the Servicemen’s Readjustment Act (aka “The GI Bill”).  College now became an option for many veterans whose economic circumstances would not otherwise have allowed it.  The post-war American public felt equally well-disposed toward faculty, especially those in the sciences who had helped build weaponry essential to the war effort, but also toward social scientists and humanities scholars who had contributed their knowledge of distant lands and foreign languages.84

The onset of the Cold War further enhanced the stature of America’s universities.  With Europe ravaged by war and Britain impoverished, the United States became a custodian of western culture, as well as the leader in the West’s response to Marxist communism and an aggressive Soviet Union.  These challenges fit nicely with the upward reassessment of American culture begun in the 1930s by independent scholars such as Van Wyck Brooks and Edmund Wilson, and lent themselves to the emergence of American Studies/American Civilization as an academic field and American “exceptionalism” as an area of academic research.  The articulation of a consensual ideology of “liberal anti-communism” became a task post-war American academic scholars enthusiastically took upon themselves, both in print and in the classroom.  The interwar curricular staple, “Survey of Western Civilization,” was now at many institutions matched — and in some instances displaced — by a required course in American history.85

The unexpected but welcome postwar economic prosperity also helped.  Where enrollment growth in higher education had slowed during the Great Depression and turned negative during the war, the five years between 1945 and 1950 saw a tripling of college-goers, the 2,000,000 veterans among them full-tuition payers.  Growth slowed in the mid-1950s, when the smaller birth cohort of the Depression years reached college age.  But by the late 1950s growth accelerated again with the first of the Baby Boomers and continued into the late 1960s.  By then the percentage of Americans of college age attending college had grown from 15% in 1949 to 35% in 1969.  Governors and legislators rushed to meet the demand for new campuses.  Northeastern states like New York and Massachusetts, which had earlier let private universities and colleges meet most of their higher educational needs, now became major providers of higher education.86

The post-war demand for faculty was equally pressing.  Framed as a shortage of PhDs, the problem became a focus of the nation’s major philanthropic foundations, among them the Rockefeller Foundation and the Carnegie Foundation, and then, following its reorganization in 1948, the Ford Foundation.  The result was that annual PhD production between 1949 and 1970 grew from 6,400 to 32,000.  By the late 1960s a slowdown in faculty hiring had begun, but no commensurate slowdown in the production of PhDs.87

To be sure, American academe’s “golden age” did not enrich everyone equally.  Public higher educational institutions grew at a faster rate than private ones, universities more than colleges.  More growth in enrollments occurred at two-year post-secondary institutions than at four-year programs.  Professional benefits, including graduate fellowships and academic appointments, accrued more to men than to women.  Until late in the post-war period, African Americans were excluded from state universities in the South and went unrecruited elsewhere.  While post-war institutions of higher education became more welcoming of Jews and Catholics, both as students and faculty, and women were again by the late 1950s a growing proportion of college-goers, the era’s claims to fashioning a perdurable “meritocracy” sound forced in retrospect.88

Faculty hiring procedures in the post-war years remained informal and ascriptive.  An assistant professor newly hired at Brown in 1952 described the prevailing process: “Recruiting by the present instructors of men with whom they feel at home.”  His gender attribution was both natural and statistically on point at a time when even women’s colleges gave preference to male faculty applicants.89

While good times continued, the last thing on the minds of university and college presidents, or bountiful state legislatures, was curricular reform.  And what of students?  It was with specific reference to curricular issues in the 1950s that Morton and Phyllis Keller, in Making Harvard Modern, offered a general pronouncement:  “Nothing in the College more consistently engaged the faculty, or less consistently engaged the students.” 90

And what of the foundations?  Did not their generous funding of higher education earn them a place at the curriculum-deciding table?  The case of the Ford Foundation’s International Training and Research Program suggests a limited one.  To be sure, universities welcomed funding to expand offerings in the social sciences and humanities to encompass regions of the world beyond the United States and Western Europe.  While the University of Chicago’s Oriental Institute, Harvard’s Russian Research Center and the University of Pennsylvania’s South Asian Institute all expanded operations with Ford funding, most of the attendant growth came at universities new to area studies [Read Duke and Indiana].  No self-respecting university political science department could expect to move up the rankings without a specialist in South Asia, an Africanist and at least two Sovietologists.  Adequate coverage in Hindi, Bengali and Urdu only revealed a heretofore unacknowledged neglect of Tamil, Marathi and Punjabi.91

Meanwhile, colleges like Amherst and Wellesley added to their faculty ranks PhDs trained on the foundation’s nickel.  Their curricular offerings broadened accordingly.  Much as the competition among under-financed 19th-century colleges had made curricular innovation too risky, that among newer public institutions and the older privates kept curricular reforms to a minimum.  Competition may actually have increased the attractiveness of the liberal arts as the academic fare of choice.  Administrators and faculty at newly created branches of a state system [think Michigan State, UC Davis] competed with their peers on the “flagship” campus on a department-by-department basis, creating “pillars of excellence” as they went.  For new four-year programs at what had been normal schools [Read SUNY Geneseo] the challenge was to provide the same liberal arts curriculum as local private colleges, just cheaper.92

Post-war Macalester College provides a special instance of this general trend.  Founded in 1885, the St. Paul-based institution operated into the 1950s as a regional four-year college under Presbyterian auspices, serving Minnesota and the Midwest.  In the 1960s the college’s financial situation brightened with the first of many large gifts from DeWitt and Lila Wallace, owners and publishers of the Reader’s Digest.  Macalester promptly set out to become a nationally recognized liberal arts college, in the process dropping most of its vocational offerings—nursing, elementary education, medical technology—in favor of a full array of offerings in the arts and sciences.  As older, regionally recruited and modestly credentialed faculty retired, Macalester mounted an aggressive national recruitment program to fill its tenured ranks with established scholars.93

Engineering schools and universities best known for their engineering programs faced their own competitive challenge in the post-war era.  For Stanford and MIT, Carnegie Mellon and Georgia Tech, Lehigh and Rice, the post-war challenge was to be accepted as full-service universities, not merely schools of technology.  This became more pressing with the rise of the departmental-rankings system developed under the auspices of the American Council on Education in the early 1960s.  Thus Stanford became a lively center for the humanities, while MIT developed strengths across the social sciences.  Emulation, not innovation, was the name of the game.94

A final distinctive feature of the post-war era that helps account for its curricular stasis was the unprecedented interest undergraduates showed in pursuing academic careers.  This was most pronounced at the more selective colleges, where upwards of a quarter of graduating classes in the late 1950s and early 1960s went directly on to graduate school in the arts and sciences.  The availability of Woodrow Wilson fellowships, paid teaching assistantships and, beginning in 1958, National Defense Education Act fellowships, made doctoral studies more affordable than law or business or medical school.  So long as the brightest students aspired to the positions their teachers held, and an expanding academic economy gave promise of occupationally accommodating them, a liberal arts education in a recognized discipline was in fact pre-professional — and all the more secure from disruption for being so.  Meanwhile, ample workforce opportunities beckoned less academically directed liberal arts graduates into corporate entities with job-training programs and career-length employment expectations.  Statistics and C++ had yet to be prerequisites for job-interview callbacks.95

One quantifiable result was that between 1949 and 1967 the market share of baccalaureate degrees in the humanities increased from 9% to 17%.  Meanwhile, the market share of social science ABs increased modestly.  Both gains came at the expense of the sciences, mathematics and engineering, which experienced declining market shares.  That this occurred despite public worries attending Sputnik, missile gaps, moon launches and heavy government investment in defense-related scientific research is surprising.  One possible reason is that women, who by the early 1960s again made up 45% of college-goers, had little incentive to pursue degrees in fields where they were occupationally proscribed, choosing instead to major in the humanities or the “softer” social sciences (history, anthropology) for cultural purposes or jobs in secondary education.  Whatever the reasons, the unprecedented spike the humanities enjoyed in the post-war years among college-goers proved short-lived.  It also, one dares suggest, provides an unrealistically high benchmark against which to measure the humanities’ subsequent fortunes.96

9. All Fall Down

The break-up of the post-war political consensus in which universities had enjoyed a privileged part has many starting dates.  But most observers agree it was over by the spring of 1965, when the first university teach-ins were organized and campus disruptions became the new norm.  Dissent over American foreign policy in Southeast Asia, tensions within the racial justice movement, second thoughts about urban renewal, and, just below the surface, systemic exclusionary issues affecting women, all found a place on the societal charge sheet that radical students posted on the doors of their elders’ institutions.97

Prominent among these compromised institutions were the nation’s major universities.  Where a decade earlier a Jack Weinberg, a leader of the Free Speech Movement on the University of California, Berkeley campus, might have been one of those eager undergraduates emulating his academic elders, in the fall of 1964 he warned his Berkeley compatriots, “Don’t trust anyone over 30.”  There he was joined by Mario Savio, then a doctoral student in philosophy, who dismissed Clark Kerr, the president of the University of California and one of the nation’s most respected academic leaders, as “that well-meaning liberal.”  Kerr, Columbia’s Grayson Kirk, Cornell’s James Perkins and Harvard’s Nathan Pusey all underwent job-ending attack.  Faculty were less often collectively targeted, unless connected with military research or calling for protesters to be expelled.  Indeed, on many campuses, student protesters could count on the tacit support of a segment of the faculty.98

What is notable for present purposes about the student protests is what went largely unprotested.  One untouched prerogative was the faculty’s right to mandate the curriculum; another was their right to pick their colleagues.  The student agendas on three campuses are illustrative.  At Berkeley, in the fall of 1964, students objected to administrative highhandedness in limiting political activity on campus.  The problems of huge classes and distant professors, which Kerr and others had earlier identified, went largely unremarked upon by protesters.  At Columbia in the spring of 1968, protests centered on the University’s role in the war economy and in the neighborhood, and on accommodating recently enrolled black students.  Only later did students of a more transactional temper request a voice in determining the curriculum and the hiring of faculty.  At Harvard in 1969, a demand was made that no courses be allowed “other than those asked for, and deemed significant, by students.”  The Harvard Crimson launched a short-lived series calling for a student-designed-and-student-taught curriculum.  Official Harvard responded by allowing a trial run of paired courses — “Social Change in America” and “The Radical Perspective” — offered by a junior member of the politically suspect Social Relations Department.  Despite 750 enrollments, the courses were not repeated.99

One area where curricular action did happen was with on-campus NROTC programs.  Born of a late-1940s Cold War concession to the need to accommodate the military on campus, the lure of double-tuition payments, and the wish to be included in the Navy’s exclusive club of 52 Ivies and top public universities, these programs operated into the mid-1960s on faculty sufferance.  Some faculty deemed the subjects taught, among them navigation and the history of sea power, vocational.  Others complained that in-place faculty had no voice in selecting the program’s instructors, generally career officers assigned by the Department of the Navy.  When students at Columbia, Harvard and elsewhere protested their presence on campus as part of their anti-war agenda, they found their faculties ready to go along with their removal.100

10. Children of the Fall

Some relevant numbers:  Between 1970 and 2015 enrollments in institutions of post-secondary education nearly doubled, from 11,000,000 to 20,000,000, while the percentage of Americans of college age attending college grew from 35% to 50%.  In 1970 women made up 40% of all enrollments; in 2015, 56%.  Most of the overall growth occurred in educational institutions where the liberal arts at best have a marginal place in the curriculum.  These include for-profit institutions, which in 2015 accounted for about 10% of all post-secondary enrollments.  The number of private colleges where the liberal arts remain the principal instructional fare dropped from 500 in 1970 to under 200 in 2015, when they enrolled only 6% of all college-goers.  Only at the nation’s 1,100 community colleges, which enroll about a third of all post-secondary students and where a commitment to the liberal arts under the rubric of “general education” remains robust, has there been an increase in degree completions in the humanities, from 120,000 in 1987 to 350,000 in 2015.101

In 1967, ABs in the humanities accounted for 1 in every 6 ABs (17%) awarded nationally; in 2015, 1 in 8 (12%).  This loss in market share occurred in two waves, the first beginning around 1970 and continuing into the mid-1980s, when the 15-year slide slowed and may have stopped, only to start downward again after 2002.  The current trend line for the humanities points downward, with the steepest declines in foreign languages and English.102

During this same period, the market share of the social sciences remained relatively flat, with economics gaining market share after 1991 and history losing it.  Meanwhile, the sciences (along with mathematics and engineering), thanks in part to a vigorous campaign on behalf of STEM majors — Science, Technology, Engineering and Mathematics — begun by the National Science Foundation in 1991, became the largest and fastest growing of the arts and sciences.  The visual and performing arts also achieved a larger presence, in 2015 representing 6% of all AB recipients, nearly the market share of all the humanities.103

These national data match up with my own quick check of shifts in majors since 2007 at Yale, Carleton, Wellesley, Hamilton and Williams.  On all these campuses, the sciences have overall gained and the humanities lost market share, while the social sciences have mostly held their own.  The increase in science majors has been across the board and not confined to computer science; the declines in humanities majors have been pretty comprehensive, with the biggest drops occurring in English and foreign languages; in the social sciences there has been a shift away from the “softer” and less quantitative disciplines of history and anthropology and toward economics.  This said, it bears repeating that these are movements occurring within the liberal arts and are less suggestive of a sinking ship than a rearranging of deck chairs.104

How then do we account for the widely reported five-decade decline of the liberal arts as major subjects of study among college-goers?  We can begin by noting that the decline is limited to the humanities.  And further, that it was more a decline in market share than in absolute numbers: humanities ABs awarded annually fell from 100,000 in 1967 to 90,000 in 2015.  Much of the decline in the humanities as a major subject among college-goers may be attributable to the worsening job prospects for humanities majors in the late 1960s, which turned a generation of would-be humanities professors into lawyers and financial analysts.  For bright collegians following in their wake, a major in political science or economics or computer science seems a safer bet than English or philosophy or religion.105

Another factor contributing to the slump in humanities ABs turns on gender.  By the early 1970s, college-going women had achieved numerical parity with men; since then, their majority status has grown steadily, until in 2015 women accounted for 57% of all college-goers.  During these years their historic exclusion from many of the nation’s leading colleges ended in a rush, beginning with Yale’s decision to admit women in the fall of 1968 and cresting in 1983 with Columbia College, the last of the Ivies to become co-educational.  Perhaps more importantly, where women in college previously chose their majors with limited reference to future careers, expanded occupational opportunities in law, medicine, engineering, corporate management and academe changed all that.  Now, as had traditionally been so for men, choosing a major became for women part of a longer-term plan, one that might involve professional studies but almost certainly, married or not, envisioned a career.  No longer was a major chosen for what the economist Claudia Goldin has called “cultural enrichment,” which once recommended Italian or comparative literature; now economics or computer science beckoned.  Unless, of course, today’s female undergraduate has in mind becoming a professor of Italian or comparative literature, in which case, while still a longshot, her chances of succeeding are better than they were back in the “golden age.”106

11. A Current Tale of Two Cities

The onset of tough times affected faculties of major research universities and leading liberal arts colleges differently.  At Columbia, for example, declines in the numbers of graduate students in arts and sciences allowed administration officials, strengthened by economic exigencies, to persuade previously resistant departments to make more use of their senior faculty in undergraduate instruction.  While one hopes this represented a net gain in the quality of undergraduate instruction, it did not change the standing curricular order.107  

At leading liberal arts colleges, tough times have had a different impact on faculty.  With fewer members moving to university appointments, and fewer retiring at 65, fewer slots have opened up and fewer junior faculty been tenured.  Beginning in the 1970s colleges found themselves unable to offer tenure to junior faculty who earlier would have been accommodated.  The question of whom to cut loose and whom to retain became especially fraught on previously all-male campuses, because the squeeze began just as they were facing the gender rebalancing attendant upon the arrival of women students.  To complicate matters further, the early 1970s ushered in federally mandated hiring regulations that made the “old-boys” pipeline inoperative.108

While some moved faster than others, all leading liberal arts colleges set aside earlier reliance on local assessments of teaching effectiveness, collegiality and institutional “fit” as the key criteria for tenure. In their place came scholarly productivity and professional standing.  While the earlier criteria remained necessary for tenure, they no longer sufficed.  Tenuring became less a decision determined locally — “Is he good for here?” — and more a disciplinary matter in the hands of outside reviewers — “Where does she stand in her field?”109

This change has had an impact on the curriculum of the leading colleges.  For junior faculty irrespective of gender, it became imperative that they teach within their discipline on topics complementary to their research, and that they be spared teaching large survey courses in favor of advanced seminars.  It also meant their avoiding like the plague any interdisciplinary coursework that might suggest to outside reviewers a lack of disciplinary loyalty.  The title Amherst College faculty gave to a collection of essays in 1991 captures the professional imperative of faculty of the fall: Teaching What We Do.110

12. What Is To Be Done?

This essay has focused on major research universities and the leading private liberal arts colleges because I believe that they are the natural habitat of the liberal arts in America.  I also believe they are where the liberal arts have the best prospects of surviving downbeat times.  Further, I have focused on the faculties of these institutions because they have been — and remain — the liberal arts’ doughtiest defenders, for self-interested as well as disinterested reasons.

Neither the hundreds of two-year community colleges, despite their substantial investment in the liberal arts in their general-education offerings, nor the two dozen public liberal arts colleges, reliant on ever harder-to-come-by state funding, can be counted on to resist societal pressures for a higher educational system tied to job-getting imperatives.  Even less can the handful of for-profit institutions that offer instruction in the liberal arts.  At many of the once-but-no-longer liberal arts colleges that David W. Breneman identified in 1994, faculty have ceded control of the curriculum to “enrollment managers” and outside consultants, while “contingent” faculty do more and more of the teaching (a fact of contemporary academic life not limited to struggling colleges).  Nor, finally, can the liberal arts look for defenders among students, particularly minority and first-generation students, for whom the liberal arts can seem the playthings of the well-born.  Ditto their financially pressed parents seeking a bankable return on investment.  None have the stake in the preservation of the liberal arts that liberal arts faculty do.111

If one accepts the argument that self-interested faculties at research universities and leading colleges have for the last 140 years largely determined the content of the undergraduate curricula, has that been such a bad thing?  Who else would have done a better job?  Administrators or trustees, donors, public officials or conservative critics?  Have students thereby suffered or been short-changed?  Have the professions been diminished or society impoverished?  At worst, charges of faculty malfeasance merit the Scotch verdict:  “Not proven.” 

Specifically, how valid is the charge that faculty control has stifled curricular innovation?  The durability of field and discipline labels obscures important changes that have transpired within them.  Witness shifts in subject matter and methods during my own career as an historian and social scientist.  Where diplomatic, political and intellectual history once held sway, race, gender and the environment are now featured.  Similarly, colleagues in the humanities and “softer” social sciences have welcomed technological developments into their teaching, research and publishing venues.  Meanwhile, changes in the economic and racial makeup of our students have been accommodated in some measure by what and how we teach.112

This is not to minimize the departmental staffing problems in the humanities attendant upon enrollment declines brought about by students (and their tuition-paying parents) voting with their feet.  Nor is it to suggest that the dictum “if it ain’t broke, don’t fix it” applies to the current state of the liberal arts in America.  It is to be historically mindful that the squeeze that English departments are now experiencing, classics departments experienced decades ago.  And yet the classics remain a vital part of the nation’s academic and intellectual fare.  If a return to the heady market-share levels the humanities enjoyed in the 1960s is not in the cards, given newer and equally legitimate demands by the applied sciences and computing for places in the undergraduate curriculum, neither is the current dip in absolute numbers likely to continue in the face of a growing recognition of the practical utility — what William James risked calling “the cash value” — of the humanities in solving open-ended questions, encouraging collaboration and fostering critical reasoning.  What’s not worth keeping?

Meanwhile, it behooves those universities and colleges that have traditionally hosted the liberal arts in America, as well as the philanthropic foundations that for more than a century have helped underwrite them, to continue to do so — even as they address in the real world of competing demands the always unpopular question as regards the humanities or any other field of academic endeavor:  How much is enough to assure survival and encourage renewal?

For the anxious untenured assistant professor of philosophy or the stressed chair of the religion department living in an era when the humanities are losing market share to the surging STEM fields and the quantitatively-enabled social sciences, two suggestions.  First, remember that the concept of “market share” is derived from the world of commerce and is a poor measure of academic impact.  “A teacher,” the erstwhile Harvard instructor of medieval history, Henry Adams, reminds us, “affects eternity.  He can never tell where his influence stops.”  Gender specificity aside, sustaining advice.  Second, as per the World War II British poster, we may yet do well to “Keep Calm and Carry On.”

End Notes

1.  John Henry Newman, The Idea of a University (1852).

2. Louis Menand, The Marketplace of Ideas (New York:  W.W. Norton, 2010), 16; William James, “The Social Value of the College-Bred,” Memories and Studies (1911), 309.  See also W. B. Carnochan, The Battleground of the Curriculum (Stanford:  Stanford University Press, 1993).

3. Eric Ashby, The Rise of the Student Estate in Britain (Cambridge:  Harvard University Press, 1961).

4. Samuel Eliot Morison, The Founding of Harvard College (Cambridge:  Harvard University Press, 1935), 3–91.

5. Jurgen Herbst, “Translation Study: The Transfer of Learning from the Old World to the New,” History of Higher Education Annual, 1992, 85–99; J. David Hoeveler, Creating the American Mind: Intellect and Politics in the Colonial Colleges (London:  Rowman and Littlefield, 2002).

6. Morison, The Founding of Harvard College, 92–107.

7. Morison, The Founding of Harvard College, Appendix D, 419–447.

8. Samuel Eliot Morison, Three Centuries of Harvard (Cambridge: Harvard University Press, 1936), 24.

9. Josiah Quincy, History of Harvard University (Boston, 1860); Samuel Eliot Morison, Three Centuries; Winthrop Hudson, “The Morison Myth Concerning the Founding of Harvard College,” Church History, 8 (June 1939), 148–159; Harry S. Stout, “University Men in New England, 1620–1660:  A Demographic Analysis,” Journal of Interdisciplinary History, 4 (Winter 1974), 375–400.

10. Bernard Bailyn, “Foundations,” in Bernard Bailyn, Donald Fleming, Oscar Handlin, Stephan Thernstrom, Glimpses of Harvard Past (Cambridge:  Harvard University Press, 1986), 1–18.

11. On William and Mary, Jurgen Herbst, “The First Three American Colleges: Schools of the Reformation,” Perspectives in American History, 8 (1974), 7–52.

12. Richard Warch, School of the Prophets: Yale College, 1701–1740 (New Haven:  Yale University Press, 1973), 1–40.

13. Jurgen Herbst, From Crisis to Crisis: American College Government, 1636–1819 (Cambridge:  Harvard University Press, 1982); Thomas S. Kidd, The Great Awakening: The Roots of Evangelical Christianity in America (New Haven:  Yale University Press, 2007).

14. David C. Humphrey, From King’s College to Columbia, 1746–1800 (New York:  Columbia University Press, 1976), 231–266.

15. Robert A. McCaughey, Stand, Columbia: A History of Columbia University in the City of New York, 1754–2004 (New York:  Columbia University Press, 2003), 29.

16. Ibid., 36.

17. Mark A. Noll, Princeton and the Republic, 1768–1822 (Princeton:  Princeton University Press, 1989).

18. Richard A. Harrison, Princetonians, 1769–1775: A Biographical Dictionary (Princeton:  Princeton University Press, 1980).

19. Bernard Bailyn, The Ideological Origins of the American Revolution (New York:  Knopf, 1965).

20. Roger Geiger, The American College in the Nineteenth Century (Nashville:  Vanderbilt University Press, 2000).

21. Benjamin Rush, “A Plan for the Establishment of Public Schools and the Diffusion of Knowledge in Pennsylvania…,” (1786), and “Thoughts Upon Female Education,” (1787),  in Frederick Rudolph, ed., Essays on Education in the Early Republic (Cambridge:  Harvard University Press, 1965), 3–23, 24–40; Louis C. Hatch, The History of Bowdoin College (1927), 19.

22. Allan Nevins and Milton Halsey Thomas, eds., The Diary of George Templeton Strong, 1835–1849, Vol. 1 (New York: The Macmillan Company, 1952), 6; Frederick Rudolph, Curriculum: A History of the American Undergraduate Course of Study Since 1636 (San Francisco: Jossey-Bass, 1977).

23. Thomas Jefferson to Peter Carr, September 7, 1814, in Adrienne Koch and William Peden, eds., The Life and Selected Writings of Thomas Jefferson (New York: The Modern Library, 1944), 643–649.

24. Richard Hofstadter and Walter Metzger, The Development of Academic Freedom in the United States (New York: Columbia University Press, 1955).

25. Colin Burke, American Collegiate Populations: A Test of the Traditional View (New York: NYU Press, 1982); Wilson Smith, “Apologia pro Alma Matre,” in Stanley Elkins and Eric McKitrick, eds., The Hofstadter Aegis: A Memorial (New York: Alfred Knopf, 1974), 125–153.

26. Michael C. Pak, “The Yale Report of 1828: A New Reading and New Implications,” History of Education, 48 (February 2008), 30–57.

27. Committee of the Corporation and Academical Faculty, Reports on the Course of Instruction in Yale College (New Haven: 1828).

28. Robert A. McCaughey, Josiah Quincy: The Last Federalist, 1772–1864 (Cambridge: Harvard University Press, 1974), 152–162; McCaughey, Stand, Columbia, 103–107, 146–152.

29. Eliot quoted in Donald Fleming, “Eliot’s New Broom,” in Glimpses of the Harvard Past (Cambridge: Harvard University Press, 1986), 64.

30. Codman Hislop, Eliphalet Nott (1971); Robert A. McCaughey, A Lever Long Enough: A History of Columbia’s School of Engineering and Applied Science Since 1864 (New York: Columbia University Press, 2014), 20.

31. McCaughey, Stand, Columbia, 87–89.

32. McCaughey, Josiah Quincy, 132–162.

33. Ibid., 163–194.

34. Ibid.

35. Francis Wayland, Thoughts on the Present Collegiate System (1841); Walter C. Bronson, The History of Brown University, 1764–1914 (Providence: 1914); Donald H. Fleming, Science and Technology in Providence, 1760–1914 (Providence: Brown University, 1952).

36. McCaughey, A Lever Long Enough, 12–13.

37. McCaughey, Stand, Columbia, 117–143.

38. Laurence R. Veysey, The Emergence of the American University (Chicago: University of Chicago Press, 1965); Edward Shils, “The Order of Learning in the United States: The Ascendancy of the University,” in Alexandra Oleson and John Voss, eds., The Organization of Knowledge in Modern America, 1860–1920 (Baltimore: Johns Hopkins University Press, 1979).

39. Carl Diehl, Americans and German Scholarship, 1770–1870 (New Haven: Yale University Press, 1978); Lenore O’Boyle, “Learning for Its Own Sake: The German University as Nineteenth-Century Model,” Comparative Studies in Society and History, 25 (January 1983), 3–25.

40. Allan Nevins, The State Universities and Democracy (Urbana: University of Illinois Press, 1962).

41. Unpublished research of the author on the first generation of American-trained PhDs.

42. Hugh Hawkins, Between Harvard and America: The Educational Leadership of Charles W. Eliot (New York: Oxford University Press, 1972); Ronald Story, The Forging of an Aristocracy: Harvard and the Boston Upper Class, 1800–1870 (Middletown: Wesleyan University Press, 1980).

43. Hugh Hawkins, Pioneer: A History of the Johns Hopkins University, 1874–1889 (Ithaca: Cornell University Press, 1960), 33.

44. Robert A. McCaughey, “The Transformation of American Academic Life: Harvard University, 1821–1892,” Perspectives in American History, VIII (1974), 239–332.

45. Ibid., 287–291; Hawkins, Pioneer.

46. George Santayana, Character and Opinion in the United States (New York, 1920), 142–143.

47. John Barnard, From Evangelicalism to Progressivism at Oberlin College, 1866–1917 (Columbus: Ohio State University Press, 1969).

48. John Higham, “The Matrix of Specialization,” in Alexandra Oleson and John Voss, eds., The Organization of Knowledge in Modern America, 1860–1920 (Baltimore: The Johns Hopkins University Press, 1979), 3–18.

49. Dorothy Ross, “The Development of the Social Sciences,” Oleson and Voss, The Organization of Knowledge, 107–138.

50. McCaughey, A Lever Long Enough, 51–54.

51. Rudolph, Curriculum.

52. George E. Peterson, The New England College in the Age of the University (Amherst: Amherst College Press, 1964), 44.

53. McCaughey, Stand, Columbia, 170–171; John W. Burgess, The American University: When Shall It Be? Where Shall It Be? What Shall It Be? (1884), reprinted in Burgess, Reminiscences of an American Scholar (New York: Columbia University Press, 1934), 349–368.

54. Richard Norton Smith, The Harvard Century: The Making of a University to a Nation (New York: Simon & Schuster, 1986), 72–75; Rudolph, Curriculum.

55. Henry Adams, The Education of Henry Adams (New York: The Modern Library, 1931), 305–06.

56. Michael Rosenthal, Nicholas Miraculous: The Amazing Career of the Redoubtable Dr. Nicholas Murray Butler (New York: Farrar, Straus & Giroux, 2006), 93–94.

57. McCaughey, Stand, Columbia, 213; On the importance of competition and prestige in the emergent university system, I am indebted to the work of Michael S. Pak.

58. Charles J. Sykes, The Hollow Men: Politics and Corruption in Higher Education (Washington: Regnery Gateway, 1990), 90.

59. John Erskine, My Life as a Teacher (Philadelphia: Lippincott, 1948); McCaughey, Stand, Columbia, 285–290; Commission on Educational Policy, Critique of a College (Swarthmore College, 1967), 52.

60. Oscar Handlin, “Making Men of the Boys,” Glimpses of the Harvard Past, 60.

61. Ibid.

62. McCaughey, Stand, Columbia, 178–182.

63. Ibid.

64. Mabel Newcomer, A Century of Higher Education for American Women (New York: Harper & Brothers, 1959); Rosalind Rosenberg, Beyond Separate Spheres: Intellectual Roots of Modern Feminism (New Haven:  Yale University Press, 1982).

65. Helen Lefkowitz Horowitz, The Power and Passion of M. Carey Thomas (New York: Alfred A. Knopf, 1994); William James, “The Ph.D. Octopus” (1903), in Memories and Studies (1911), 329–347.

66. David O. Levine, The American College and the Culture of Aspiration, 1915–1940 (Ithaca: Cornell University Press, 1986).

67. Harold Wechsler, The Qualified Student: A History of Selective College Admission in America, 1870–1970 (New York: Wiley, 1977); Marcia Graham Synnott, The Half-Opened Door: Discrimination and Admissions at Harvard, Yale and Princeton, 1900–1970 (Westport: Greenwood, 1979); Dan A. Oren, Joining the Club: A History of Jews at Yale (New Haven: Yale University Press, 1985); James Axtell, The Making of Princeton University (Princeton: Princeton University Press, 2010); McCaughey, Stand, Columbia, 256–276.

68. Roger Geiger, “Taking the Pulse of the Humanities Indicators: Higher Education in the Humanities Indicators Project,” Humanities Indicators Prototype (American Academy of Arts and Sciences, 2009), 1–15.

69. Timothy P. Cross, An Oasis of Order: The Core Curriculum at Columbia College (New York: Columbia University, 1995), 1–23.

70. McCaughey, Stand, Columbia, 285–299.

71. Cross, Oasis of Order, 24–86.

72. For a recent appreciation of the Columbia core curriculum, see David Denby, Great Books: My Adventures with Homer, Rousseau, Woolf and Other Indestructible Writers of the Western World (New York: Simon & Schuster, 1996); see also Andrew Delbanco, College: What It Was, Is, and Should Be (Princeton: Princeton University Press, 2012).

73. McCaughey, Stand, Columbia, 285–299.

74. Ibid.

75. McCaughey, Stand, Columbia, 296; Patricia Grieve et al., “Reflections on Columbia College’s Core Curriculum,” (unpublished report of the Committee on the Core, Columbia University, 2009).

76. John W. Boyer, The University of Chicago: A History (Chicago: Chicago Scholarship Online, May 2016); Mary Ann Dzuback, Robert Maynard Hutchins: Portrait of an Educator (Chicago: University of Chicago Press, 1991). 

77. Boyer, The University of Chicago.

78. Ibid.

79. Philo A. Hutcheson, “In the President’s Opinion: Robert Maynard Hutchins and the University of Chicago Department of History,” History of Higher Education Annual (1997), 33–52. 

80. Hutcheson, “Hutchins,” 36.

81. On Alexander Meiklejohn, see American National Biography.

82. On Aydelotte, see Burton R. Clark, The Distinctive College (1970).

83. Ibid.

84. Stuart W. Leslie, The Cold War and American Science: The Military-Industrial-Academic Complex at MIT and Stanford (New York: Columbia University Press, 1993); on the “Golden Age,” Christopher Jencks and David Riesman, The Academic Revolution (Garden City: Doubleday, 1968).

85. Gilbert Allardyce, “The Rise and Fall of the Western Civilization Course,” The American Historical Review, 87 (June 1982), 698–725; Peter Novick, That Noble Dream: The “Objectivity Question” and the American Historical Profession (Chicago: University of Chicago Press, 1988), 281–301.

86. Clark Kerr, The Uses of the University (Cambridge: Harvard University Press, 1963).

87. McCaughey, Stand, Columbia, 411–13; National Science Foundation, Division of Science Resources Statistics, U.S. Doctorates in the 20th Century, NSF 06-319.

88. Recent critique of “meritocracy” claims

89. Fleming, Science and Technology in Providence.

90. Morton Keller and Phyllis Keller, Making Harvard Modern:  The Rise of America’s University (New York:  Oxford University Press, 2007), 42.

91. Robert A. McCaughey, International Studies and Academic Enterprise:  A Chapter in the Enclosure of American Learning (New York: Columbia University Press, 1984).

92. Ibid., 205–06.

93. William G. Bowen and Eugene M. Tobin, Locus of Authority: The Evolution of Faculty Roles in the Governance of Higher Education (Princeton: Princeton University Press, 2016), 291–314.

94. McCaughey, A Lever Long Enough, 116–120; David Kaiser, “Elephant on the Charles: Postwar Growing Pains,” in David Kaiser, ed., Becoming MIT: Moments of Decision (Cambridge: MIT Press, 2010), 103–122.

95. Allan M. Cartter, Ph.D.’s and the Academic Labor Market: A Report Prepared for the Carnegie Commission on Higher Education (New York: McGraw-Hill, 1976).

96. Geiger, “Taking the Pulse of the Humanities Indicators,” 1–15.

97. McCaughey, Stand, Columbia, 423–461; Keller and Keller, Making Harvard Modern, 297–306.

98. Clark Kerr, The Gold and the Blue: A Personal Memoir of the University of California, 1949–1967, Vol. 2 (Berkeley: University of California Press, 2003); Max Heirich, The Spiral of Conflict: Berkeley 1964 (New York: Columbia University Press, 1971).

99. Kerr, The Uses of the University; McCaughey, Stand, Columbia, 423–461; Keller and Keller, Making Harvard Modern, 312–313; Roger Rosenblatt, Coming Apart: A Memoir of the Harvard Wars of 1969 (Boston: Little, Brown, 1997).

100. Full disclosure: I was successively an NROTC midshipman at the University of Rochester (1957–61), an instructor of Naval Science at the University of North Carolina NROTC Program (1963–65) and supported the restoration of the NROTC Program at Columbia University in the early 2000s.

101. National Center for Education Statistics —

102. Humanities Indicators, American Academy of Arts & Sciences.

103. On the rise of STEM, see “Science, Technology, Engineering and Mathematics: Education for Global Leadership,” U.S. Department of Education.

104. Graduates by selective majors at five institutions (tables only partially recoverable):
Yale Graduates by Selective Majors, 2007 and 2016: legible entries include -49%, +14%, Life Sciences, and Physical Sciences (pairings not recoverable).
Carleton College Graduates by Selective Majors, 2006 and 2016: Foreign Languages -28%; History & Social Sciences -16%; Computer Science +331%; one unlabeled field -33%.
Wellesley Graduates by Selective Majors, 2006–2016: Social Sciences -9%; Physical Sciences +43%; Life Sciences +88%; two unlabeled fields -40% and -28%; values for Foreign Languages and Computer Science not recoverable.
Hamilton College Graduates by Selective Majors, 2006 and 2016: Foreign Languages +14%; Computer Sciences +111%; two unlabeled fields -48% and -29%; values for Social Sciences, Physical Sciences, and Life Sciences not recoverable.
Williams College Graduates by Selective Majors, 2011 and 2016: Foreign Languages -15%; Physical Sciences +50%; Computer Science +82%; one unlabeled field -26%; values for Social Sciences and Life Sciences not recoverable.

105. Doctorate Recipients from U.S. Universities, 2014 (Washington: National Science Foundation, December 2015); Laura McKenna, “The Ever Tightening Job Market for PhDs,” The Atlantic, April 21, 2016.

106. Nancy Weiss Malkiel, “Keep the Damned Women Out”: The Struggle for Coeducation (Princeton: Princeton University Press, 2016); Ben Schmidt, “Gender and the Long-Term Decline in Humanities Enrollments,” Sapping Attention (blog), June 26, 2013; Claudia Goldin, “The Quiet Revolution That Transformed Women’s Employment, Education and Family,” The American Economic Review, Vol. 96, No. 2 (May 2006), 1–21.

107. McCaughey, Stand, Columbia, 575–76.

108. Robert A. McCaughey, Scholars and Teachers: The Faculties of Select Liberal Arts Colleges and Their Place in American Higher Learning (New York: Andrew W. Mellon Foundation, 1994), 29–63.

109. Ibid., 64–87.

110. Amherst College Faculty, Teaching What We Do (Amherst: Amherst College Press, 1991).

111. Personal observation of the author as member of the Barnard College faculty (1969– ), chair of the history department (1983–87; 1995–98), Dean of Faculty (1987–1994), and elected member of the Academic Tenure and Appointments Committee (1997–2001).

112. David W. Breneman, Liberal Arts Colleges: Thriving, Surviving, or Endangered? (Washington: The Brookings Institution, 1994).

113. Some Barnard instances of ongoing change: When I joined the 10-member History Department in 1969, its area coverage was limited to Western Europe and the United States, with one member also covering Russia and Eastern Europe. In 2017 the 15-member department includes two Latin Americanists, an Africanist, a South Asianist and an East Asianist; two of its Americanists work on African American history and two on Native American history. Six do research in women’s history and two in environmental history. In 1969 the department was 60% male; in 2017 it is 60% female. In 1969 its 10 members were white; in 2017, 5 of its 15 members are persons of color. In 1969 the Barnard student body was 85% non-Hispanic white; the Class of 2019 is 50% non-Hispanic white.


Brief Secondary Bibliography

Bowen, William G., and Eugene M. Tobin, Locus of Authority: The Evolution of Faculty Roles in the Governance of Higher Education. Princeton University Press, 2016.

Delbanco, Andrew, College: What It Was, Is, and Should Be. Princeton University Press, 2012.

Menand, Louis, The Marketplace of Ideas.  Norton, 2010.