Sept. 3, 2012
This post originally appeared on Higher Ed Watch.
As anyone who has ever attended college probably realizes, the currency of degrees and credentials is the credit hour. But few people know where the credit hour comes from. Today the New America Foundation and Education Sector have released Cracking the Credit Hour, a report that covers the credit hour’s history from the days of Andrew Carnegie to the latest “credit hour” regulation. This new policy paper argues that measuring time, rather than learning, is a luxury that students, taxpayers, and the nation can no longer afford. Throughout the week, I will be blogging about Cracking the Credit Hour, beginning with the origins of the credit hour.
Surprisingly, credit hours came about largely because of professor pensions, and not as a way to measure learning. As a trustee of Cornell University, Andrew Carnegie was bothered by low compensation that didn’t allow faculty to save for retirement. Through the Carnegie Foundation for the Advancement of Teaching, he created a free pension system for professors (the legacy lives on today as the not-free TIAA-CREF). Colleges and universities were of course eager to participate in a system that offered free pensions for their faculty. The Foundation decided to leverage this excitement to promote high school reform by requiring that any college wanting to participate in the pension program had to use the “standard unit,” developed earlier to standardize high school courses, for college admission purposes. Colleges had nothing to lose and free pensions to gain, so the time-based standard unit (forever after known as the “Carnegie Unit”) became the de facto standard for determining high school graduation and college admission requirements.
Carnegie’s pension system spurred colleges and universities to convert their own course offerings into time-based units, which were then used to determine faculty workload thresholds to qualify for the new pension program. The credit hour equaled one hour of faculty-student contact time per week over a fifteen-week semester. Faculty members who taught 12 credit units a semester, a full load, qualified for full-time pension benefits. The “Carnegie Unit” still survives today at the secondary level and its “credit hour” offspring has become the fundamental building block of courses and degree programs at the postsecondary level.
Unfortunately, the credit hour has become a proxy for student learning. The Carnegie Foundation was quite clear that this was not the intent of a time-based unit. In its 1906 annual report, it stated explicitly that “in the counting the fundamental criterion was the amount of time spent on a subject, not the results attained.”
Why does it matter where the credit hour came from? College degrees that are built upon credit hours are largely disconnected from evidence of learning. Tomorrow I will share thoughts about why that’s a problem and what steps the federal government can take to help us move from seat time to learning.
(You can find the report on the Cracking the Credit Hour page here.)