
Estimates of the rate of return to a college education — apart from their inherent cynicism, which reduces the value of education to a dollars-and-cents calculation — are rife with problems, but essentially boil down to a fraction: the population-level salary increment college graduates receive as a whole, divided by the direct expenses of going to college. Most methods also add to the denominator the work and wages lost — the opportunity cost of going to college — and typically assume that an entire 40-hour work week is supplanted by studying.
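That fractional calculation can be sketched in a few lines. This is a minimal, hypothetical illustration of the structure described above — every dollar figure and the wage are invented placeholders, not data from the paper; only the 40-hour assumption comes from the text.

```python
# Hedged sketch of the back-of-the-envelope rate-of-return fraction:
# salary increment over (direct expenses + opportunity cost).
# All numbers below are hypothetical placeholders.

def college_rate_of_return(salary_increment, direct_expenses, opportunity_cost):
    """Lifetime salary increment divided by the total cost of attending."""
    return salary_increment / (direct_expenses + opportunity_cost)

# Conventional assumption: a full 40-hour work week is supplanted by study,
# so four years of forgone wages enter the denominator in full.
hourly_wage = 15.0            # hypothetical wage a non-student would earn
weekly_hours_forgone = 40     # the standard assumption
weeks_per_year, years = 50, 4
opportunity_cost = hourly_wage * weekly_hours_forgone * weeks_per_year * years

direct_expenses = 80_000.0    # hypothetical tuition, fees, books
salary_increment = 500_000.0  # hypothetical lifetime earnings premium

print(college_rate_of_return(salary_increment, direct_expenses, opportunity_cost))
```

Note that under these made-up numbers the forgone wages dwarf the direct expenses — which is why the opportunity-cost assumption matters so much to the result.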

A new working paper at the National Bureau of Economic Research (NBER) by Philip Babcock and Mindy Marks entitled, “The Falling Time Cost of College: Evidence from Half a Century of Time Use Data,” shows that part of the denominator — the opportunity cost of going to college — has been falling since 1961, with decreases across the board.

The authors studied full-time students at four-year colleges, as captured in surveys conducted in 1961, 1979-1987, 1981, and 2003-2005. They found that time spent on academic work fell from 40 hours per week in 1961 to 27 hours per week in 2004, and that the decline wasn’t related to race, gender, ability, family background, major, employment, or type of college.

It’s a dramatic change in the work students put in during college, visible even at a lower threshold of effort. In 1961, 67% of full-time students at four-year colleges studied more than 20 hours per week. By 2003, only 20% studied that much, and by 2004, only 13%.

The authors also checked whether the effects were caused simply by students taking fewer courses. Short answer? No. The surveys covered primarily on-schedule students at four-year institutions — students on track to graduate in four years — so these are not the students stretching out overall time to degree through part-time study, leaving and returning, or lighter workloads.

Because different survey instruments were concatenated, the authors had to adjust for framing effects — to tease out whether the phrasing of questions about study habits was affecting the answers. They used a clever technique: recruiting current students, randomly assigning them the same questions in the different formats, measuring the differences, and adjusting the historical data by the gap between formats found during the exercise. Framing turned out to be a moderately significant factor in the responses, so it was worth teasing out.
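The adjustment described above amounts to a difference-in-means correction. Here is a toy sketch of that idea with entirely made-up numbers — the sample sizes, hours, and even the direction of the shift are illustrative, not the paper’s actual procedure or data.

```python
# Toy illustration of a framing adjustment: give current students the
# same question in two formats at random, estimate the mean gap between
# formats, and shift responses from the old format by that gap.
# All numbers are invented for illustration.

from statistics import mean

format_a = [22, 25, 19, 24, 21]  # weekly study hours reported under format A
format_b = [18, 20, 17, 19, 16]  # same population, question phrased as format B

framing_gap = mean(format_a) - mean(format_b)  # estimated framing effect

historical_format_b = [30, 28, 33]             # old survey used format B
adjusted = [h + framing_gap for h in historical_format_b]

print(framing_gap, adjusted)
```

The design choice here is the random assignment: because students are randomized across formats, any mean difference can be attributed to the question wording rather than to who answered.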

Now, the key question: Why is this happening?

It’s tempting to think that information technologies might explain the changes. After all, fewer trips to the library, the ability to locate references with a click, and other improvements in learning modalities could be subtracting scut work from education — eliminating the time-consuming friction of inefficiency.

However, the largest decline in study time appears to have occurred between 1961 and 1981, hardly an era associated with a robust consumer information technology infrastructure. If anything, this period suggests that social change might be a contributing factor to the shift in college lifestyles.

The authors offer some other tentative explanations (universities adapted to students so became more efficient places of study; costs and payback of college altered how students divvied up their time), but can’t explain their data satisfactorily. Others will have to do that.

However, a few implications are clear:

  1. For students who are on-schedule to graduate in four years (a declining percentage), college has lower opportunity costs than it used to, and calculations shouldn’t assume a 40-hour work week is supplanted by studying
  2. For universities, students are contributing less academic labor than they used to
  3. For administrators, calculations of the lifetime value of a university education just changed — opportunity costs are lower for students who stay on-schedule, so payback estimates should improve
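The size of the shift in implications 1 and 3 is easy to quantify. This is a minimal sketch assuming a hypothetical forgone wage; the 40- and 27-hour figures come from the study, everything else is a placeholder.

```python
# How the falling time cost shrinks the opportunity-cost term in a
# rate-of-return calculation. Wage and schedule are hypothetical.

hourly_wage = 15.0
weeks_per_year, years = 50, 4

def opportunity_cost(weekly_study_hours):
    """Forgone wages over four years, assuming study hours displace work."""
    return hourly_wage * weekly_study_hours * weeks_per_year * years

old = opportunity_cost(40)  # the traditional 40-hour assumption
new = opportunity_cost(27)  # the 2004 average reported in the paper

print(old, new, new / old)
```

Because the opportunity-cost term scales linearly with study hours, the drop from 40 to 27 hours cuts that term by roughly a third regardless of the wage assumed — which is why payback estimates for on-schedule students should improve.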

Another explanation — purely speculative on my part — is that as specialized knowledge has become more important, general education has been somewhat thinned in order to push advanced academics (both curriculum and faculty) farther into master’s and PhD programs. The incentives all point in this direction — universities keep students longer in aggregate, students become even more viable in the job market, tuition income increases, and faculty feel more intrinsically validated. The pattern this study found could be some evidence of that shift.

Namely, everyone’s pacing themselves through undergrad for the 2-year sprint of grad school.

Kent Anderson

Kent Anderson is the CEO of RedLink and RedLink Network, a past-President of SSP, and the founder of the Scholarly Kitchen. He has worked as Publisher at AAAS/Science, CEO/Publisher of JBJS, Inc., a publishing executive at the Massachusetts Medical Society, Publishing Director of the New England Journal of Medicine, and Director of Medical Journals at the American Academy of Pediatrics. Opinions on social media or blogs are his own.



2 Thoughts on "The First Four Years of College: Why Are Students Spending Less Time Studying?"

Might be interesting to see the results of this study plotted against the rate of grade inflation that’s occurred over the same period.

  • David Crotty
  • May 11, 2010, 9:36 AM
