Should College Get Harder?
Around twenty years ago, when I was a graduate student in English, I taught a class in a special observation room at my university’s teaching center. My students and I sat around a long oval table while cameras recorded us. I can’t remember which novel we discussed, but I do know what I learned when I watched the tape afterward, with a teaching coach. She pointed out that, when I was calling on students, I often looked to my right, missing the raised hands on my left. I didn’t let silences go on long enough, instead speaking just when a student had worked up the courage to talk. On the plus side, she noticed I’d been using a technique she liked, which I’d borrowed from a professor of mine: it was like cold-calling, except that, after you’d surprised a student with a challenging question, you told them that you’d circle back in a few minutes, to give them time to consider what they’d say. This, she told me, was “warm-calling.”
Teaching was my favorite part of graduate school, and I signed up for as much training as I could. While I was teaching, or otherwise focussed on students, my role in the project of higher education made sense to me: I was spending years learning about literature so that I could explain it to students who wanted to better themselves. Outside of class, though, the enterprise was murkier. I knew that what really mattered to my professional advancement was academic research. My teaching skills were basically irrelevant. In fact, I’d been warned that teaching was a distraction from the “real work” of writing articles for my peers.
It sometimes seemed as though coursework was a distraction for my students, too. Although earnest and diligent, they were often so immersed in extracurricular activities—charity efforts, musical groups, sports, startups—that they struggled to find time to study. I myself had been part of a startup as an undergrad, and was familiar with the underlying logic driving extracurricular overcommitment: grade inflation, which allowed mediocre students to do less work, also made it harder for excellent ones to distinguish themselves academically. All the incentives, for both teachers and students, encouraged doing less in the classroom and more outside of it.
These contradictions weren’t surprising; they reflected the complex nature of the modern university, in which undergraduate pedagogy is just one of several competing priorities. The implicit theory was, essentially, that students would learn what they could from the university’s top-tier researchers, some of whom were brilliant teachers and some of whom were not. Some classes would be difficult, others laughably easy; grades would be uniformly high; and, in any case, there would be plenty to do outside of class. It would be edifying to be around so many great minds. When learning didn’t happen through instruction, it would happen through osmosis.
Was this theory persuasive? Twenty years ago, it seemed so—but today the gears may no longer be meshing. Student debt has become a generational burden, with tens of millions of people taking on federal loans for degrees. At the same time, college seems to have become dramatically easier, in ways that suggest a diminishment of its core functions. In “The Amateur Hour: A History of College Teaching in America,” the education scholar Jonathan Zimmerman observes that, in 2011, about forty-three per cent of college letter grades were A’s; in 1988, the figure was thirty-one, and, in 1960, it was fifteen. (In The Atlantic, Rose Horowitch reports that, in 2024, the average G.P.A. of Harvard’s graduating class was 3.8.) Over roughly the same period, “the average amount of studying by people in college declined by almost 50 percent, from 25 to 13 hours per week.” Zimmerman cites a survey finding that, in one semester, half of the respondents from a wide range of institutions hadn’t taken a single course that required writing more than twenty pages, total.
It’s still the case that college graduates tend to be higher earners. And yet the newest data show that people with four-year degrees are now struggling to find jobs. Artificial intelligence, meanwhile, may soon reshape work in a variety of fields; many popular college majors, such as marketing, may prove less valuable than they used to be. In students’ hands, A.I. also threatens to turn the classroom into a theatre, in which the act of learning is mimed rather than embraced. Students can have chatbots do their work for them, teachers can give that work inflated grades, and everyone can feel good while learning very little. “There’s a mutually agreed upon mediocrity between the students and the teachers and administrative faculty,” the folk singer Jesse Welles explains, in his song “College.” “You pretend to try, they’ll pretend you earned the grade.” If you want to be a doctor or an engineer, Welles sings, college might be worth it; otherwise, you might “skip the Adderall prescription,” and acquire “a YouTube subscription.”
Since the middle of the past century, the number of Americans in college has increased substantially—in raw numbers, by roughly a factor of five. This development has felt inevitable, driven by the rise of knowledge work and the opening of higher education to once-excluded groups. And yet, in the past decade, enrollment has begun to contract, and that contraction is expected to continue. Demography is a potential factor: a decline in the birth rate, which began around 2007, is predicted to result in fewer high-school seniors. But it also seems possible that more people are concluding that college has changed, and isn’t worth the cost. Universities go out of their way to seem eternal, but higher education is an industry like any other, with its share of ups and downs. If college is a bubble, could it be getting ready to burst?
“Academics used to be the main event at college, surrounded by a lot of sideshows,” Zimmerman told me recently. “Now, the sideshows may be the main event.” Even well-intentioned, well-resourced universities have struggled to stop this shift, and Zimmerman finds the roots of the problem in the history of American college teaching. He begins with Mark Hopkins, a professor of philosophy who was the president of Williams College from 1836 to 1872. If Socrates invented the seminar, Hopkins was his American emissary: at a time when education was often conducted through large lectures and by means of rote recitation, he led students in conversations about the meaning of life. “The ideal college is Mark Hopkins on one end of a log and a student on the other,” James A. Garfield, who was one of his students, later said. This idea became a lodestar for educators, Zimmerman writes, who came to understand college teaching as a “charismatic” activity, dependent mainly on the personal vivacity of professors.
There are good reasons for holding this view. A gifted teacher can change your life; a curriculum pre-written by a bureaucracy is unlikely to. As K-12 teachers know, administrative supervision of curricula is fraught with procedural and political perils. Colleges, Zimmerman shows, have navigated this territory by staying true to Garfield’s vision. Decade by decade, they grew larger and more institutionally complex, with flowcharts full of provosts—but, as “more and more of American higher education came under the bureaucratic umbrella, teaching mostly remained outside of it.” Today, administrators micromanage every aspect of college life, but the design and implementation of coursework remain mostly a private matter for individual professors to decide for themselves.