What's in the Department's Competency-Based Experimental Site Notice?

For those who have been eagerly awaiting the details of the U.S. Department of Education’s experimental sites on competency-based education (including prior learning assessments), the wait is over. The Department of Education has posted a preview of the official notice for schools that wish to apply to be experimental sites.

First, a quick refresher on the Experimental Sites Initiative. (Those who want to get straight to the nitty-gritty of the Department’s new notice can skip ahead to the highlights below.)

What is the Experimental Sites Initiative?

The Experimental Sites Initiative is a tool Congress has given the Department of Education to seed and refine policy ideas that relate to the important, expensive, and complicated area of federal financial aid. A part of the Higher Education Act since the 1992 reauthorization, it has evolved to allow the Department to waive regulatory and/or statutory financial aid requirements for small, volunteer groups of institutions to figure out what works, for whom it works, and under what conditions it works. Congress can then use the results of these experiments to inform future policymaking. Experimental sites give Congress a way to see how policies might work before they are implemented writ large, hopefully mitigating unintended consequences. (Evidence-based policy making?! Crazy talk.)

This sounds powerful. Why haven’t I heard of this before?

There is, indeed, tremendous potential for experimental sites, but they have received scant attention over the years. Part of the obscurity can be attributed to the original intent: streamlining and reducing the regulatory burden associated with the administration of federal financial aid. The original "experiments" were a bit of a misnomer—there were no clear evaluation measures, there was no specified time frame, and there were no established reporting requirements for institutions or the Department. Rather than test the effects of financial aid changes on student outcomes, they served (or, at least, many in Congress believed they served) largely as waivers of laws and regulations for a handful of schools. They did not, for the most part, inform broader policymaking.

This changed in 2008 when, as part of the Higher Education Opportunity Act, Congress required the Department to assess whether the experiments were successful in benefiting students and improving the delivery of financial aid. Since then, the Department has run a few experiments that aim to inform broader policy (e.g. allowing institutions to lower students’ unsubsidized loans under certain conditions, allowing the use of Pell Grants to pay for short-term vocational training). But experimental sites (and CBE) got a real PR boost last fall, when President Obama said on his college bus tour touting his higher education agenda: "We want to [...] encourage more colleges to embrace innovative new ways to prepare our students for a 21st century economy and maintain a high level of quality without breaking the bank." How would he realize this lofty rhetoric? In part, by dusting off the underused experimental sites authority.

Back to the eye-glazing (but important and forward-thinking, if you can get through the FederalSpeak) 43-page Department document detailing four new experiments. Once the official announcement is made in the Federal Register this week, institutions will have 60 calendar days to apply to participate in one or more of the experiments. Institutions must have a strong track record of administering federal financial aid and must agree to report on, as well as evaluate, their experiments.

There’s a lot in there, but here are some highlights for each of the three CBE/PLA experiments (leaving out the fourth experiment on work-study):

1. Prior Learning Assessment experiment—Too often, colleges treat students like blank slates, despite the fact that some have college-level learning acquired outside of the classroom (e.g. in the workplace or in the military). If students could demonstrate this knowledge with Prior Learning Assessments (PLAs), they may be able to bypass classes whose material they’ve already mastered, saving valuable time and money and increasing their odds of completing. Currently, however, federal financial aid does not pay for PLAs or the time students spend preparing for PLAs. This experiment would allow institutions to include “reasonable costs” associated with PLAs (e.g. test fees) in a student’s official Cost of Attendance, which is used in determining how much federal financial aid a student needs. This experiment would also allow students engaged in significant preparation for PLA—like preparing materials for a portfolio assessment, but NOT studying for or taking a test—to have up to three credits count towards their Pell Grant enrollment status. This experiment hopes to ask and answer a number of questions, including: What do assessments cost? How does PLA affect borrowing or completion? How do schools determine which PLAs to accept?
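The Cost of Attendance mechanics above can be sketched in a few lines. This is only an illustration, not the Department's actual methodology: the real need analysis is far more involved, and all of the function names and dollar figures below are invented. The core idea, though, is real: need-based aid eligibility is roughly capped at Cost of Attendance minus the Expected Family Contribution (EFC), so letting schools count reasonable PLA costs in the COA raises the need figure that aid can cover.

```python
# Hypothetical illustration of how adding PLA costs to the Cost of
# Attendance (COA) changes a student's calculated financial need.
# Roughly: need = COA - Expected Family Contribution (EFC).
# All names and dollar figures below are invented for illustration.

def cost_of_attendance(tuition, fees, living, pla_costs=0):
    """Sum the COA components; under the experiment, 'reasonable'
    PLA costs (e.g. test fees) may be included."""
    return tuition + fees + living + pla_costs

def financial_need(coa, efc):
    """Need-based aid eligibility is capped at COA minus EFC."""
    return max(coa - efc, 0)

# Without the experiment: PLA test fees can't be counted.
base_coa = cost_of_attendance(tuition=9_000, fees=500, living=12_000)
# With the experiment: a hypothetical $400 in PLA test fees is added.
pla_coa = cost_of_attendance(tuition=9_000, fees=500, living=12_000,
                             pla_costs=400)

print(financial_need(base_coa, efc=5_000))  # 16500
print(financial_need(pla_coa, efc=5_000))   # 16900
```

In this toy example, counting the PLA fee raises the student's need by exactly the cost of the assessment, which is the mechanism the experiment uses to let aid cover it.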

2. Limited Direct Assessment experiment—Last year, the Department used another largely forgotten authority, direct assessment, to allow institutions to use financial aid to pay for learning rather than time (as measured in credit hours). Although this was a huge step for the federal government, policy dictates that programs be either credit hour-based or direct assessment-based for the purposes of distributing financial aid. But students may find that a mix of approaches works best for them. They may want a couple of seat-time classes and a few classes that let them go at their own pace. This experiment would allow students to mix and match. It would also allow developmental education programs to qualify for direct assessment, which is prohibited under current regulations. (Note that this experiment doesn’t solve confusion that institutions and accreditors have had with direct assessment. But there may be good news on that front, too. Under Secretary Ted Mitchell said at a recent meeting of the Competency-Based Education Network that the Department would soon issue guidance on direct assessment.)

3. Competency-Based Education experiment—Here’s where we see the most innovative thinking from the Department. Below are a few highlights, some of which are very in the weeds.

Allowing direct and indirect costs to be paid differently. Schools divide a student’s financial aid into direct costs (those paid directly to the school for tuition, fees, etc.) and indirect costs (often not paid directly to the school, such as living expenses). In this experiment, schools can, but don’t have to, disburse these types of costs differently. If a school wants to release financial aid only after students complete a certain number of competencies, it can do that for the costs it directly controls. But it must disburse indirect costs on a regular schedule, since students’ rent, food, and other bills need to be paid on time. For both types of costs, however, the Department is offering tremendous flexibility in how and how often disbursements are made. (For those really wanting to get in the weeds—since disbursements are smaller, Return to Title IV will be waived.)
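The two disbursement rules above can be expressed as a small sketch. To be clear, the trigger (a competency-completion threshold) and all of the names and amounts are hypothetical, not language from the notice; the sketch just shows the asymmetry the Department is allowing: direct costs may be gated on progress, while indirect costs must keep flowing on a calendar.

```python
# Hypothetical sketch of the two disbursement rules described above:
# direct costs (tuition, fees) may be gated on competency completion,
# while indirect costs (living expenses) must follow a regular schedule.
# All names, thresholds, and amounts are invented for illustration.

def direct_disbursement(amount, competencies_done, threshold):
    """Release direct costs only once the student has completed
    a school-chosen number of competencies (hypothetical rule)."""
    return amount if competencies_done >= threshold else 0

def indirect_schedule(total, periods):
    """Split indirect costs into equal, regularly scheduled payments
    so rent, food, and other bills can be paid on time."""
    per_period = total / periods
    return [per_period] * periods

# A student who has finished 2 of a required 3 competencies gets no
# direct-cost disbursement yet...
print(direct_disbursement(4_000, competencies_done=2, threshold=3))  # 0
# ...but indirect costs still arrive on schedule regardless of pace.
print(indirect_schedule(6_000, periods=4))  # [1500.0, 1500.0, 1500.0, 1500.0]
```

The design point is that only the school-controlled side of the aid package can be made contingent on progress; the living-expense side cannot, for the reasons the notice gives.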

Allowing for a more CBE-appropriate definition of Satisfactory Academic Progress. One requirement for students to receive financial aid is that they make "satisfactory academic progress" (SAP). This is to ensure that the federal financial aid program encourages completion and that the federal government doesn’t subsidize someone’s college education indefinitely. SAP is essentially calculated by comparing credits completed to credits attempted, a problematic requirement for self-paced CBE programs where, instead of grades, students either have or have not mastered the material. In some programs, students fail multiple times before they demonstrate mastery. This experiment allows SAP to be calculated by looking at credits completed over a longer period of time (an academic year), letting students move at their own pace without being penalized for unsuccessful early attempts as they work toward mastery.
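A toy comparison makes the SAP change concrete. The 67% pace threshold and the credit counts below are hypothetical, borrowed from common term-based SAP policies rather than from the notice: measured term by term, a self-paced student who fails early attempts looks like they are falling behind, but measured over the full academic year the same student is on track.

```python
# Hypothetical sketch of the SAP change described above.
# Pace = credits completed / credits attempted; a 67% threshold is a
# common (illustrative) standard, not language from the notice.

def sap_pace(completed, attempted):
    """Fraction of attempted credits actually completed."""
    return completed / attempted if attempted else 1.0

# A self-paced CBE student fails some early attempts in term 1,
# then masters the material by year's end.
term1 = sap_pace(completed=6, attempted=12)   # 0.5 -> fails a 67% check
year = sap_pace(completed=24, attempted=30)   # 0.8 -> passes over the year

print(term1 >= 0.67)  # False: a term-by-term check penalizes the student
print(year >= 0.67)   # True: an academic-year check does not
```

Same student, same attempts; only the measurement window changes, which is exactly the flexibility the experiment grants.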

Allowing flexibility in what counts as instructional time. The experiment requires any CBE program to have an academic year of at least 30 "weeks of instructional time." This might cause some CBE hearts to stop beating momentarily, since "instructional time" is often associated with, well, instruction. A narrow reading of this could mean that we’re back to regularly scheduled seat time (aka instruction), rather than learning. But there are many ways that institutions can support student learning, rather than simply transmitting knowledge through direct instruction. Western Governors University, one of the institutions that President Obama held up when announcing his plans to advance CBE through experimental sites, offers no direct instruction; rather, it uses coaches, subject-matter experts, and others who direct students to appropriate learning resources critical to student success. According to the Department's own language, it seems this experiment is going with the broader interpretation of instructional time:

Consistent with existing regulations, for purposes of this experiment, a week of instructional time is any seven-day period in which the institution makes available [emphasis added] to the students enrolled in the CBE program, instructional materials and faculty support so that a student could be engaged in an educational activity.… Also consistent with existing regulations, for the purpose of this experiment, an educational activity includes, but is not limited to, regularly scheduled learning sessions, faculty-guided independent study, consultations with a faculty mentor, development of academic action plans covering the competencies identified by the institution, or, in combination with any of the foregoing, assessments [emphasis added].

If a broader reading is correct, then as long as 1) students ultimately achieve the required competencies over the academic year, and 2) institutions provide students opportunities to engage every week, it doesn’t matter if students receive "instruction" every week. What confuses this interpretation, however, is a subsequent statement in the notice that "regular and substantive interaction between students and instructors will be required." Are "instructors" only people who provide instruction, or are they those who fill the other "educational activity" functions? I think it’s the broader case, but since the terms "regular and substantive interaction" and "instruction" have a narrow and fraught regulatory history, the Department should provide more clarity lest it be interpreted narrowly. (Pro tip to the Department from a former fed: You can't explain enough. What is crystal clear to you on the inside is filtered through the real and perceived experiences of a compliance-oriented, narrowly focused bureaucracy on the outside. You have to constantly reaffirm your commitment to thinking and acting creatively throughout the experimental site process.)

Although it may have taken a few decades to get to a place where experimental sites are being used to seed innovation, it looks like we might finally be there. Kudos to Congress for giving the Department the authority to innovate, and kudos to the Department for resisting the urge to just tinker around the edges. If executed well, these experiments have tremendous potential to lead to breakthrough ways of delivering high-quality education. We're looking forward to the next steps!


Amy Laitinen is director for higher education with the Education Policy program at New America. She previously served as a policy advisor on higher education at both the U.S. Department of Education and the White House.

Lindsey Tepe is a senior policy analyst with the Education Policy program at New America. She is a member of the Learning Technologies project and PreK-12 team, where she focuses primarily on innovation and new technologies in public schools.