Closing the Evidence Gap
Abstract
Too few students are graduating from college and reaping the financial benefits that a higher education bestows, even as more of them take on debt to enroll. It is essential that lawmakers engage in evidence-based policymaking. Members of Congress must support greater data transparency, research, and evaluation so that they can finally start to close the gap between what we know works to improve students’ chances of success and what does not, and so we know which students need help, and in which ways. Billions of taxpayer dollars and millions of students’ futures are on the line.
A concerted national effort to improve college is needed. It will require political will and a rethinking of the federal government’s higher education programs so that they center on finding and expanding the strategies that work for students rather than treating evidence as an afterthought. This report seeks to explain the impetus for change and to identify solutions for policymakers in the Department of Education and on Capitol Hill.
Acknowledgments
New America would like to thank Arnold Ventures for its support of this project, and the Bill & Melinda Gates Foundation for its generous support of our work. The views expressed in this report are those of its author and do not necessarily represent the views of foundations, their officers, or employees.
Introduction
Every year, Congress invests nearly a billion dollars in the TRIO programs, which are intended to support college access and success for low-income students and students of color, filling a critical need for students far less likely than their high-income peers to enroll in or graduate from college. But are the programs working? How do they promote student success? And are students of color and low-income students going to, and finishing, college because of these investments? Supporters would say yes; in fact, they have embraced the motto “TRIO Works.” They have pasted it onto buttons that they wear when they champion the programs on Capitol Hill, hashtagged it on social media, and printed it on merchandise.
But the truth is that we do not know how well the TRIO programs work, and will not know anytime soon. That is because, instead of using evidence and research to ensure that the programs are effective and are serving students well, the U.S. Department of Education is forbidden by law from undertaking the kind of rigorous evaluation that is needed to affirm their effectiveness.
The story behind that prohibition starts in 2007, when the Bush administration proposed eliminating TRIO’s Upward Bound program because a national evaluation of the program several years earlier had found disappointing results. The Council for Opportunity in Education (COE), which lobbies on behalf of TRIO grantees, launched an aggressive campaign that it dubbed “Operation Rolling Thunder” to convince Congress that it was “unethical, even immoral” to require the grantees to participate in a randomized controlled trial, in which not all students would be eligible for Upward Bound services.1 U.S. Department of Education officials fought back, saying that these types of evaluations are invaluable in showing policymakers whether the programs the agency runs maximize investments in them.2
Using incendiary language comparing the Upward Bound study to the infamous and horrifyingly unethical Tuskegee experiment,3 in which federal scientists withheld life-saving treatments from Black men suffering from syphilis to study the progress of the disease, COE won the day. Lawmakers added a rider to the Department’s annual appropriations bill that barred it from continuing to fund the ongoing national evaluation.4 But the lobbying group’s real victory came when Congress added a provision to the Higher Education Act in 2008 banning the Department from ever conducting randomized controlled trials of the program, or of any TRIO program, further protecting the status quo for existing grantees.5
Where there is a program tied to federal funding, there are, of course, stakeholders with significant investments in its future. But this saga shows how stakeholders who benefit financially from federal programs can make it difficult for policymakers to assess whether those programs are working effectively, potentially undermining lawmakers’ ability to ensure that students are truly the ones who benefit.
The U.S. Department of Education is forbidden by law from undertaking the kind of rigorous evaluation that is needed to affirm TRIO programs' effectiveness.
This is not just a political matter, and it is certainly not limited to the TRIO programs. For too long, lawmakers have forged higher education policy without knowing what works and what does not. Much of the federal money invested in support for institutions and other grantees is spent on efforts rooted in little more than a broad, vague goal of supporting student success, whether or not they actually do so. Across the Higher Education Act, from TRIO and GEAR UP to funds dedicated to minority-serving institutions to efforts to increase innovation, policymakers have rarely asked—and researchers have never answered—the question of how best to improve outcomes for low-income students.
Too few students are graduating from college and reaping the financial benefits that a higher education bestows, even as more of them take on debt to enroll. It is essential that lawmakers engage in evidence-based policymaking. Members of Congress must support greater data transparency, research, and evaluation so that they can finally start to close the gap between what we know works to improve students’ chances of success and what does not, and so we know which students need help, and in which ways. Billions of taxpayer dollars and millions of students’ futures are on the line.
A concerted national effort to improve college is needed. It will require political will and a rethinking of the federal government’s higher education programs so that they center on finding and expanding the strategies that work for students rather than treating evidence as an afterthought. This report seeks to explain the impetus for change and to identify solutions for policymakers in the Department of Education and on Capitol Hill.
The Need to Improve College Outcomes
Earning a college degree gives students of all backgrounds the best chance of finding a job, avoiding unemployment, and increasing their lifetime earnings considerably.6 But the American higher education system is often little more than a sieve, sifting students out of college before they earn a degree, especially the most vulnerable among them.
Low-income students are less likely to go to college, and if they do enroll, are less likely to go to the colleges that will give them the greatest chance of success. According to the Equality of Opportunity Project, children from the wealthiest 1 percent of families are 77 times more likely to attend an Ivy League college than children from the poorest 20 percent of families.7 In fact, other research has shown that low-income and underserved students are more likely to enroll in programs whose labor-market returns are too low to justify their cost.8 For-profit colleges disproportionately enroll low-income and Black students, while charging more and reporting lower earnings than public-college programs in the same field of study.9 And across sectors, the Center on Education and Skills at New America has found significant gender gaps in both labor market outcomes and costs for training programs.10
Moreover, once they enroll in higher education, too many students drop out, leaving school empty-handed and with debt loads they cannot afford to pay back.11 Low-income students and students of color are also less likely to persist and graduate with a degree. For instance, nearly two out of three associate and bachelor’s degrees are awarded to white students, according to a Department of Education report.12 Even at community colleges, where higher education programs are most affordable, fewer than one in three students graduates within one-and-a-half times the length of the program.13
Low-income students are less likely to go to college, and if they do enroll, are less likely to go to the colleges that will give them the greatest chance of success.
There are bright spots, as some colleges have tackled these problems systematically, using a data-driven and evidence-informed approach, with demonstrated results. The City University of New York (CUNY) in 2007 instituted the Accelerated Study in Associate Programs (ASAP) intervention, a comprehensive set of supports that include financial assistance, advising and career counseling, tutoring, training in study skills, and reformed remedial courses.14 A rigorous evaluation of the program by MDRC found that it increased both course taking and retention. Within three years, ASAP nearly doubled graduation rates, from a shockingly low 22 percent to 40 percent, and increased transfer rates to a four-year school from 17 percent to 25 percent.15 The program was scaled up to include community colleges in Ohio, with a rigorous evaluation showing similar, positive results.16
Other interventions that have been rigorously studied have also found promising results.17 Scholarships tied to academic goals—like maintaining a C average—increased college graduation rates at a couple of community colleges in Ohio by 21 percent after two years, and increased the number of credits students earned in pilot programs conducted across six states.18 Informational outreach via text message has been found to increase retention rates by nearly 14 percent.19 And colleges that offered coaching on how to juggle work, school, and life, and how to be successful in college found that their students were over 5 percent more likely to stay enrolled six months and even a year later, and they were 4 percent more likely to graduate from their programs—a lasting effect that few other interventions have found.20
But the research on what works to improve students’ odds of success in higher education remains limited. Only a handful of organizations and researchers even conduct such evaluations. And once studies are completed, their findings reach practitioners at colleges and universities only haphazardly, with no explicit dissemination effort by the Education Department. The federal government has likewise done little to coordinate research or to tackle the challenge of raising college completion rates.
Moreover, even when promising interventions have been identified, colleges have few incentives to apply that research. Many of these interventions require significant upfront costs, and the benefits and savings to campuses, which are often realized through increased retention and graduation rates, may take a long time to achieve. Federal policy requires only that colleges avoid extremely high default rates and comply with federal rules and regulations, few of which are based on outcomes. Federal dollars to support improvement and quality assurance are limited. States with performance-based funding models may starve the institutions most in need of improvement of the funding they might need to implement new practices. As a result, practitioners often lack the resources—or incentive—they need to implement such practices.
Still, the time is ripe for a renewal of higher education’s commitment to serving students, particularly those most at risk of dropping out of college. While the public continues to have faith in local colleges that serve their communities, its trust in higher education as a whole is at historic lows.21 Practitioners in the field are searching for new and smarter ways to prove the value of higher education.
Even when promising interventions have been identified, colleges have few incentives to apply that research.
Meanwhile, there is growing interest on Capitol Hill and in the states in using evidence and evaluation to improve policy. In 2016, Sen. Patty Murray (D-WA) and former Rep. Paul Ryan (R-WI) convened the Commission on Evidence-Based Policymaking. Its landmark 2017 report laid out dozens of strategies for federal agencies, some of which were signed into law earlier this year.22 Recent legislation, the Fund for Innovation and Success in Higher Education (FINISH) Act introduced by Sens. Young (R-IN), Bennet (D-CO), and Scott (R-SC), would invest in evidence-based practices and increase the Education Department’s evaluation authority.23 And then-chair of the House education committee Virginia Foxx introduced the PROSPER Act, a comprehensive bill to reauthorize the Higher Education Act, which would require the Education Department to rigorously evaluate projects when it grants regulatory flexibility to colleges,24 eliminate the ban on evaluating TRIO grantees’ efforts, dedicate 10 percent of the programs’ funding to an evidence-based competition, and increase the Department’s authority to conduct “pay-for-success” projects to innovate and test new ideas within its existing programs.25
These lawmakers’ embrace of evidence-based policymaking is encouraging and long overdue. And if they follow through, using research not to cut spending but to reinvest resources in what works, we could see significant advances in achieving greater student success.
Citations
- Kelly Field, “Are the Right Students ‘Upward Bound?’” Chronicle of Higher Education, August 17, 2007, source; and Doug Lederman, “Little Change for Upward Bound,” Inside Higher Ed, September 25, 2006, source.
- Kelly Field, “Are the Right Students ‘Upward Bound?’” Chronicle of Higher Education, August 17, 2007, source.
- Ibid.
- Kelly Field, “Education Department Agrees to End Controversial Upward-Bound Study,” Chronicle of Higher Education, February 25, 2008, source; Kelly Field, “Senate Approves Measure Blocking Evaluation of Upward Bound,” Chronicle of Higher Education, October 19, 2007, source; and Kelly Field, “Are the Right Students ‘Upward Bound?’” Chronicle of Higher Education, August 17, 2007, source.
- Higher Education Opportunity Act, §428H, enacted August 14, 2008.
- Jennifer Ma, Matea Pender, and Meredith Welch, Education Pays 2016: The Benefits of Higher Education for Individuals and Society (New York: The College Board, 2016), source.
- Raj Chetty, John N. Friedman, Emmanuel Saez, Nicholas Turner, and Danny Yagan, “Mobility Report Cards: The Role of Colleges in Intergenerational Mobility,” Opportunity Insights, July 2017, source. The Equality of Opportunity Project is now called Opportunity Insights.
- Ibid.
- “The State of For-Profit Colleges,” Center for Responsible Lending (website), January 29, 2019, source; and U.S. Department of Education, “Fact Sheet: Department of Education Announces Release of New Program-Level Gainful Employment Data,” November 16, 2017, source.
- Lul Tesfai, Kim Dancy, and Mary Alice McCarthy, Paying More and Getting Less: How Nondegree Credentials Reflect Labor Market Inequality Between Men and Women (Washington, DC: New America, September 2018), source.
- Robert Kelchen, “A Look at Pell Grant Recipients’ Graduation Rates,” Brookings (website), October 25, 2017, source; and Jean Johnson, Jon Rochkind, Amber N. Ott, and Samantha DuPont, With Their Whole Lives Ahead of Them: Myths and Realities About Why So Many Students Fail to Finish College (Brooklyn, NY: Public Agenda, December 9, 2009), source.
- Advancing Diversity and Inclusion in Higher Education (Washington, DC: U.S. Department of Education, November 2016), source.
- U.S. Department of Education, National Center for Education Statistics, “Table 362.20: Graduation Rate from First Institution Attended within 150 Percent of Normal Time,” prepared December 2017, source.
- City University of New York (website), “About: What is ASAP?” source.
- Susan Scrivener, Michael J. Weiss, Alyssa Ratledge, Timothy Rudd, Colleen Sommo, and Hannah Fresques, Doubling Graduation Rates: Three-Year Effects of CUNY’s Accelerated Study in Associate Programs (ASAP) for Developmental Education Students (New York: MDRC, February 2015), source.
- MDRC, “Ohio Programs Based on CUNY’s Accelerated Study in Associate Programs (ASAP) More than Double Graduation Rates,” press release, December 2018, source.
- James Kvaal and John Bridgeland, Moneyball for Higher Education: How Federal Leaders Can Use Data and Evidence to Improve Student Outcomes (Washington, DC: Results for America, January 2018), source.
- Reshma Patel, Lashawn Richburg-Hayes, Elijah de la Campa, and Timothy Rudd. “Performance-Based Scholarships: What Have We Learned? Interim Findings from the PBS Demonstration,” MDRC policy brief, August 2013, source.
- Benjamin L. Castleman and Lindsay C. Page, “Freshman Year Financial Aid Nudges: An Experiment to Increase FAFSA Renewal and College Persistence,” Journal of Human Resources 51, no. 2 (June 2014): 389–415, source.
- Eric P. Bettinger and Rachel B. Baker, “The Effects of Student Coaching: An Evaluation of a Randomized Experiment in Student Advising,” Educational Evaluation and Policy Analysis 36, no. 1 (March 2014): 3–19, source.
- Eric Kelderman, “‘Higher Education’ Isn’t So Popular, Poll Finds, but Local Colleges Get Lots of Love,” Chronicle of Higher Education, May 21, 2018, source. For more on the issue of trust in higher education, see Rachel Fishman, Ernest Ezeugo, and Sophie Nguyen, Varying Degrees 2018: New America's Annual Survey on Higher Education (Washington, DC: New America, May 2018), source.
- The Promise of Evidence-Based Policymaking (Washington, DC: Commission on Evidence-Based Policymaking, September 7, 2017), source; and Foundations for Evidence-Based Policymaking Act of 2017, P.L. 115-435, enacted January 14, 2019.
- The Fund for Innovation and Success in Higher Education (FINISH) Act. S. 1059.
- For more on the need for evaluation in federal experiments, see Clare McCann, Amy Laitinen, and Andrew Feldman, Putting the Experiment Back in the Experimental Sites Initiative (Washington, DC: New America, January 23, 2018), source.
- The Promoting Real Opportunity, Success, and Prosperity through Education Reform (PROSPER) Act. H.R. 4508; introduced in the 115th Congress.
Stoking Competition: Background on Grant Programs in Higher Education
The federal government invests over $130 billion annually in higher education through Pell Grants, student loans, tax benefits for students and their families, and other forms of student aid. But it provides comparatively little aid directly to institutions, and even less to help colleges implement and evaluate evidence-based interventions.
Two of the federal government’s largest investments in college, the TRIO and GEAR UP programs, are designed to motivate and assist low-income students and students of color in enrolling in and completing college, with more than $1.4 billion in annual funds. Together, these programs form a patchwork of hundreds of individual grantees working toward similar goals, but often in different ways, with few opportunities to learn and share which strategies are most effective and for which populations.
The Education Department also provides funding directly to minority-serving colleges and universities through an array of formula and competitive-grant programs. Those programs—all of which allow colleges to use the funding to serve students directly, carry out maintenance activities, and even build their endowments—are designed more to increase institutional capacity than to improve student outcomes. These include funding for historically Black colleges and universities; predominantly Black institutions; Hispanic-serving institutions; tribal colleges and universities; Native American-serving nontribal institutions; Alaska Native-serving and Native Hawaiian-serving institutions; and Asian American and Native American Pacific Islander-serving institutions.
Finally, Congress funded two years of the Obama administration’s “First in the World” program—an evidence-based, tiered grant competition focused on improving college completion. It offered grants to seed innovations, replicate promising practices, and evaluate their effectiveness, with proven interventions eligible for the largest grants. Unfortunately, Congress stopped funding the program after only two competitions, and it was never codified into the Higher Education Act, so the Department’s efforts to promote evidence-based innovation and evaluation in higher education through the program have gone largely dormant.
College Access Across the Nation
TRIO
TRIO, included in the original Higher Education Act in 1965, represents the federal government’s oldest investment in ensuring disadvantaged students are prepared for, and have access to, higher education. As the name suggests, the TRIO umbrella originally covered three separate programs: Upward Bound (which was formed the year before the HEA was signed into law, in 1964), Talent Search, and Student Support Services (added in 1968). Today, it includes seven programs, each with different but overlapping purposes, audiences, and funding levels.
Talent Search: Primarily serves disadvantaged students in high school, providing tutoring and financial counseling to help them enroll in college after graduating. It serves over 300,000 students each year through more than 470 grantees, close to 20 percent of which are nonprofit or other non-college organizations. A relatively low-touch program, it costs the federal government only about $511 per student served.
Upward Bound: Works in high schools to provide many of the same services as Talent Search, but also may offer on-campus residential programs over the summer or work-study jobs. It serves about 70,000 students each year, and the federal cost per student is almost 10 times that of Talent Search, at $5,014 per student receiving services.
Veterans Upward Bound: Although this program offers many of the same services as Upward Bound (like tutoring, academic counseling, and financial advising), it does so for the narrower population of veterans seeking to enroll in higher education. Annually, grantees provide services to about 8,150 veterans, at a price to the federal government of about $2,254 per veteran served.
Upward Bound Math and Science: Establishes centers around math and science to encourage high school students to pursue higher education in STEM programs. It is much smaller than the Upward Bound program, serving about 13,000 students per year, but provides services at a comparable price, $5,134 per student served.
Educational Opportunity Centers: Perhaps the lowest-touch TRIO program, designed to provide college counseling and other information about enrolling in higher education to members of a community rather than to a particular cohort of students. While its reach is large, serving nearly 200,000 students each year, its intensity is low, and it costs the federal government only about $267 per student.
Student Support Services: Serves students already enrolled in college to increase their chances of returning to school and graduating, through tutoring, counseling, financial assistance (provided the college ensures a one-third match of any federal dollars spent on grants to students), and other services and programs. While most grants across all TRIO programs go to colleges, all Student Support Services grants do. The program serves more than 200,000 students per year at an average federal price of $1,590 per student.
McNair: Provides undergraduate students with research opportunities, counseling, and tutoring to help them prepare for doctoral programs, as the only TRIO program focused exclusively on preparing students for graduate studies. It is the smallest TRIO program, serving only around 5,200 students each year, but in part due to expensive services and travel costs, it is also the most expensive per student, at $9,133 per participant, on average.
GEAR UP
Whereas TRIO programs typically serve students for only a year before identifying a new round of students, GEAR UP grantees typically identify a cohort of students in middle school and continue to serve them for six years, sometimes also offering a seventh year of grant services during their first year of college. Grantees provide college preparation services like tutoring and counseling, and financial aid assistance including, in some cases, college scholarships. The Department awards GEAR UP dollars to two types of grantees: partnerships of colleges, school districts, and other stakeholders, which must provide college preparation services to whole cohorts of students while they are in school; and states, which, unlike partnership grantees, are required to provide scholarships unless they receive waivers (as more than half of state grantees had in 2005).26 GEAR UP state grantees also have authority to use their funds to provide technical assistance to partnership grantees, offer professional development, and disseminate research and best practices to grantees and schools, among other things.
In short, the federal government has a number of avenues that policymakers can use to test practices to see if they improve students’ chances of success and encourage colleges and their partners to engage with research-driven reform. In fact, the Obama administration took steps to incorporate evidence as small, optional components of each of these programs.27 Despite a mixed response from some grantees, it succeeded in persuading the vast majority of them to adopt evidence-based protocols. According to a Pell Institute report that cited Department of Education data, in the 2015 Student Support Services competition, 95 percent of applicants addressed the evidence priorities the Department included (which asked grantees to propose interventions backed by research at varying levels of rigor), and 77 percent won the full six points possible under those priorities.28
TRIO Programs
With more than 50 years of experience behind several of the TRIO programs, the federal government still does not have a firm grasp of how well its grantees are performing. A study of Talent Search published in 2006 found that participating students across three states were more likely to enroll in college;29 but the body of research is thin, and the evidence on the grants’ effectiveness is mixed, making it difficult to contextualize what we do know about grantees’ performance. A study of Upward Bound conducted over the course of nearly 20 years found that the average participant was no more likely to enroll in college than students in a control group.30 And a 20-year study of Student Support Services released in 2010 found no statistically significant differences in persistence rates, transfer rates to four-year colleges, or completion rates between participating and non-participating students.31
Despite these disappointing outcomes, lobbyists for the TRIO programs have been particularly vehement in opposing efforts by policymakers to try to steer grant dollars toward the best proposals rather than simply to incumbent grantees. These lobbyists have been particularly adamant in their defense of prior experience points, statutorily mandated bonus points given to existing grantees that make it difficult for new participants to obtain grants. Even minimal efforts by policymakers to introduce evidence-based priorities (i.e., bonus points for applications that cite interventions backed by rigorous research) in the TRIO competitions are portrayed by some in the community as attempts to purge the program of long-time participants.
For instance, COE officials accused the Department of violating the Higher Education Act when it proposed in 2014 to introduce competitive preference points that they said would reduce “grantees’ prior experience points from 13 percent of their score…to just 11.7 percent.”32 The Department moved ahead with its proposal, and the end results showed only modest differences. When evidence-based priorities were in effect from 2015 to 2017, more than nine in 10 prior grantees that applied for new funding won a grant, while fewer than 30 percent of new applicants did, according to the Department.33
By many accounts, the TRIO lobby, as a part of the powerful higher education industry that wields influence with many lawmakers, is a significant obstacle to improving and expanding the use of evidence-based policies in higher education. But the fact that the program, which costs taxpayers nearly $1 billion every year, has shown lackluster results should raise eyebrows and underscore the need for change. Without the adoption of practices that have been proven successful, college access and success rates will continue to stagnate.
The TRIO lobby, as a part of the powerful higher education industry that wields influence with many lawmakers, is a significant obstacle to improving and expanding the use of evidence-based policies in higher education.
GEAR UP
The lawmakers who created the GEAR UP program in 1998 based it on the “I Have a Dream” Foundation, led by Eugene Lang, a businessman in New York who promised full college tuition to a class of sixth-graders at his former elementary school in East Harlem. Ninety percent of the students in that program completed high school, and 60 percent enrolled in college.34 Congress developed the program largely in an effort to respond to some of the challenges faced by TRIO Talent Search grantees. For instance, GEAR UP serves entire grades of students, rather than only the students a grantee manages to recruit, as in Talent Search; and it allows for school-level activities.35 And unlike Talent Search, the GEAR UP program does not include prior experience points to boost the scores of existing grantees.
While few evaluations have been conducted in recent years,36 some older research suggests a positive impact for students while they were participating in the programs, albeit few results that persist to the college level. A 2001 Education Department study found that GEAR UP students, particularly Black students, took more rigorous courses in GEAR UP middle schools than in non-GEAR UP schools, and had a better understanding of opportunities to pursue a higher education. Another study found that GEAR UP participants in California boosted their math and reading scores on state standardized tests considerably.37 A third study, by the testing company ACT, found slightly higher test scores and improvement among GEAR UP participants.38
Still, while the National Council for Community and Education Partnerships, the lobbying association representing GEAR UP grantees, is open to rigorous evaluation of the program’s effectiveness and has helped evaluate grantees across about a dozen states,39 the Education Department has found it difficult to conduct a thorough, quantitative national analysis, particularly of state grantees, which engage in a wide range of activities and may look very different from one another. Nor is the GEAR UP program centered on directing grantees to implement practices already proven to improve students’ access to a high-quality higher education. Like the TRIO programs, it has been retrofitted to some extent with Department priorities related to incorporating evidence, but only minimally so.
Minority-Serving Institution Programs
Congress has also created discretionary grant competitions, alongside some formula grants, to provide funds to minority-serving and other institutions that primarily enroll students who have been traditionally underserved. The grants, which are provided through Titles III and V of the Higher Education Act, offer funds to historically Black colleges and universities; each of the categories of minority-serving institutions (MSI) recognized in law; and, through the federal Strengthening Institutions Program, institutions that serve large shares of low-income students. These programs are designed similarly and allow institutions to use the funds for a very broad set of purposes, from the construction and maintenance of buildings, to the acquisition of technology and equipment, to tutoring and student support services, to deposits into endowment funds.40 All told, Congress spent more than $687 million this year on the MSI programs, funding a diverse set of institutional support projects.41
Beginning in fiscal year 2012, the Education Department experimented with a new approach to encourage institutions to use some of their resources on evidence-based interventions designed to support student success. That competition invited applicants to submit proposals based on rigorous research in exchange for additional points for their applications, and ultimately funded a handful of research-backed projects proposed by institutions serving low-income students and students of color; 137 of 151 applicants addressed the priority.42 The Department has continued to conduct competitions that offer priority points for evidence-supported interventions, and institutions have continued to apply for those grants.
First in the World Competition
The First in the World program, an Obama administration proposal, was funded through the appropriations process for fiscal years 2014 and 2015. Structured as a tiered evidence competition, it offered the smallest (“development”) grants to institutions with the least rigorous evidence behind their proposals; medium-sized (“validation”) grants to proposals backed by some rigorous evidence that needed more testing, or testing in different settings; and the largest (“expansion”) grants to proposals using practices that extensive, rigorous research had shown were ready to be scaled up to more campuses and students.43 In its first year, Congress appropriated $75 million to the program, and the Department made development awards to 24 institutions. In its second year, the Department awarded $60 million through 16 development grants and two validation grants.44 However, Congress never provided additional funding for the program and never codified it in the Higher Education Act beyond those two appropriations cycles. No scale-up (expansion) grants were ever awarded. The grants that were awarded are nearing the end of their activities, and evaluations should be finalized in the coming years.
Citations
- Kelly Field, “Are the Right Students ‘Upward Bound?’” Chronicle of Higher Education, August 17, 2007, source; and Doug Lederman, “Little Change for Upward Bound,” Inside Higher Ed, September 25, 2006, source.
- Kelly Field, “Are the Right Students ‘Upward Bound?’” Chronicle of Higher Education, August 17, 2007, source.
- Ibid.
- Kelly Field, “Education Department Agrees to End Controversial Upward-Bound Study,” Chronicle of Higher Education, February 25, 2008, source; Kelly Field, “Senate Approves Measure Blocking Evaluation of Upward Bound,” Chronicle of Higher Education, October 19, 2007, source; and Kelly Field, “Are the Right Students ‘Upward Bound?’” Chronicle of Higher Education, August 17, 2007, source.
- Higher Education Opportunity Act, §428H, enacted August 14, 2008.
- Jennifer Ma, Matea Pender, and Meredith Welch, Education Pays 2016: The Benefits of Higher Education for Individuals and Society (New York: The College Board, 2016), source.
- Raj Chetty, John N. Friedman, Emmanuel Saez, Nicholas Turner, and Danny Yagan, “Mobility Report Cards: The Role of Colleges in Intergenerational Mobility,” Opportunity Insights, July 2017, source. The Equality of Opportunity Project is now called Opportunity Insights.
- Ibid.
- “The State of For-Profit Colleges,” Center for Responsible Lending (website), January 29, 2019, source; and U.S. Department of Education, “Fact Sheet: Department of Education Announces Release of New Program-Level Gainful Employment Data,” November 16, 2017, source.
- Lul Tesfai, Kim Dancy, and Mary Alice McCarthy, Paying More and Getting Less: How Nondegree Credentials Reflect Labor Market Inequality Between Men and Women (Washington, DC: New America, September 2018), source.
- Robert Kelchen, “A Look at Pell Grant Recipients’ Graduation Rates,” Brookings (website), October 25, 2017, source; and Jean Johnson, Jon Rochkind, Amber N. Ott, and Samantha DuPont, With Their Whole Lives Ahead of Them: Myths and Realities About Why So Many Students Fail to Finish College (Brooklyn, NY: Public Agenda, December 9, 2009), source.
- Advancing Diversity and Inclusion in Higher Education (Washington, DC: U.S. Department of Education, November 2016), source.
- U.S. Department of Education, National Center for Education Statistics, “Table 362.20: Graduation Rate from First Institution Attended within 150 Percent of Normal Time,” prepared December 2017, source.
- City University of New York (website), “About: What is ASAP?” source.
- Susan Scrivener, Michael J. Weiss, Alyssa Ratledge, Timothy Rudd, Colleen Sommo, and Hannah Fresques, Doubling Graduation Rates: Three-Year Effects of CUNY’s Accelerated Study in Associate Programs (ASAP) for Developmental Education Students (New York: MDRC, February 2015), source.
- MDRC, “Ohio Programs Based on CUNY’s Accelerated Study in Associate Programs (ASAP) More than Double Graduation Rates,” press release, December 2018, source.
- James Kvaal and John Bridgeland, Moneyball for Higher Education: How Federal Leaders Can Use Data and Evidence to Improve Student Outcomes (Washington, DC: Results for America, January 2018), source.
- Reshma Patel, Lashawn Richburg-Hayes, Elijah de la Campa, and Timothy Rudd, “Performance-Based Scholarships: What Have We Learned? Interim Findings from the PBS Demonstration,” MDRC policy brief, August 2013, source.
- Benjamin L. Castleman and Lindsay C. Page, “Freshman Year Financial Aid Nudges: An Experiment to Increase FAFSA Renewal and College Persistence,” Journal of Human Resources 51, no. 2 (June 2014): 389–415, source.
- Eric P. Bettinger and Rachel B. Baker, “The Effects of Student Coaching: An Evaluation of a Randomized Experiment in Student Advising,” Educational Evaluation and Policy Analysis 36, no. 1 (March 2014): 3–19, source.
- Eric Kelderman, “‘Higher Education’ Isn’t So Popular, Poll Finds, but Local Colleges Get Lots of Love,” Chronicle of Higher Education, May 21, 2018, source. For more on the issue of trust in higher education, see Rachel Fishman, Ernest Ezeugo, and Sophie Nguyen, Varying Degrees 2018: New America’s Annual Survey on Higher Education (Washington, DC: New America, May 2018), source.
- The Promise of Evidence-Based Policymaking (Washington, DC: Commission on Evidence-Based Policymaking, September 7, 2017), source; and Foundations for Evidence-Based Policymaking Act of 2017, P.L. 115-435, enacted January 14, 2019.
- The Fund for Innovation and Success in Higher Education (FINISH) Act, S. 1059.
- For more on the need for evaluation in federal experiments, see Clare McCann, Amy Laitinen, and Andrew Feldman, Putting the Experiment Back in the Experimental Sites Initiative (Washington, DC: New America, January 23, 2018), source.
- The Promoting Real Opportunity, Success, and Prosperity through Education Reform (PROSPER) Act, H.R. 4508, introduced in the 115th Congress.
- Cheryl Blanco, Early Commitment Financial Aid Programs: Promises, Practices and Policies (Boulder, CO: Western Interstate Commission for Higher Education, August 2005), source.
- The author served in the Obama administration at the Education Department from 2015-2017.
- Margaret Cahalan, TRIO, The What Works Clearinghouse, and the Competitive Preference Priorities (CPPs): An Imposed Structured Practitioner-Research Collaboration (Washington, DC: The Pell Institute for the Study of Opportunity in Higher Education, 2018), source.
- Jill M. Constantine, Neil S. Seftor, Emily Sama Martin, Tim Silva, and David Myers, A Study of the Effect of the Talent Search Program on Secondary and Postsecondary Outcomes in Florida, Indiana, and Texas: Final Report from Phase II of the National Evaluation (Washington, DC: U.S. Department of Education, 2006), source.
- David Myers, Robert Olsen, Neil Seftor, Julie Young, and Christina Tuttle, The Impacts of Regular Upward Bound: Results from the Third Follow-Up Data Collection (Washington, DC: U.S. Department of Education, 2004), source. The first iteration of the same study did, however, find that students who entered Upward Bound with low expectations for their postsecondary educations were substantially more likely to complete high school. David Myers and Allen Schirm, The Impacts of Upward Bound: Final Report for Phase I of the National Evaluation (Washington, DC: U.S. Department of Education, 1999), source.
- Bradford W. Chaney, National Evaluation of Student Support Services: Examination of Student Outcomes After Six Years, Final Report (Washington, DC: U.S. Department of Education, 2010), source.
- Maureen Hoyler (Council for Opportunity in Education), public comment re: Docket ID: ED-2014-ICCD-0137-0001, November 19, 2014, source.
- U.S. Department of Education, “Fiscal Year 2019 Budget Request Congressional Justifications,” February 2018, R-109, source.
- Celeste Tarricone, “Bill Would Promise Pell Grants to Needy Sixth Graders,” Chronicle of Higher Education, July 11, 1997, source.
- Stephen Burd, “White House Plan for Low-Income Students Sparks Debate Helping Disadvantaged Students,” Chronicle of Higher Education, April 23, 1999, source.
- This is true at least of the effectiveness of the program itself; there is a near-complete evaluation of the effectiveness of sending text reminders to students in GEAR UP about deadlines for financial aid and college applications. Institute of Education Sciences (website), “Effectiveness of Promising Strategies in Federal College Access Programs: Study of College Transition Text Messaging in GEAR UP,” final report expected Spring 2020, source.
- Alberto Cabrera, Regina Deil-Amen, Radhika Prabhu, Patrick Terenzini, Chul Lee, and Robert E. Franklin, Jr., “Increasing the College Preparedness of At-Risk Students,” Journal of Latinos and Education 5, no. 2 (2006): 79–97, source.
- Using EXPLORE and PLAN Data to Evaluate GEAR UP Programs (Iowa City, IA: ACT, March 2007), source. The report was sponsored in part by the National Council for Community and Education Partnerships, a group that lobbies on behalf of GEAR UP.
- National Council for Community and Education Partnerships (website), “Evaluation Consortium,” source.
- Higher Education Act, §311.
- U.S. Department of Education (website), “Budget Tables,” 2019, source.
- Higher Education Fiscal Year 2015 Budget Request (Washington, DC: U.S. Department of Education, February 2014), T-34, source.
- U.S. Department of Education (website), “First in the World,” source. For more on tiered evidence competitions, see Andrew Feldman and Ron Haskins, “Tiered-Evidence Grantmaking,” Evidence-Based Policymaking Collaborative (website), September 9, 2016, source.
- U.S. Department of Education (website), “First in the World,” source.
Goals and Strategies for Improving the Programs
To improve the success of any of these efforts—college access programs like TRIO and GEAR UP, the minority-serving institution grant programs, or even a competition like First in the World—the programs themselves must begin to change and adapt.
First, and most importantly, both Congress and the education secretary must make gathering and using evidence a priority across all Department programs. Grantees will not adopt evidence-based practices on their own, without incentives to do so. Nor will many of them know how to interpret research, implement it carefully and faithfully on their campuses, or evaluate how the practices work in their contexts. The Department must promote measurement, evaluation, and continuous improvement by its grantees; provide support for practitioners in understanding and implementing research-based practices; and require that taxpayer dollars be spent in ways that further encourage those practices.
Second, the goals of these programs—particularly the grants to minority-serving institutions, where the uses of federal dollars are so sprawling—should be refocused on improving student success. A grant program that allows for over a dozen different uses is likely to lead colleges to pursue self-serving interests over those that most benefit their students, or to continue using the funds exactly as they always have. If lawmakers want to spend money to improve student success, they must make clear their expectation that grantees take proven and promising approaches to improving student outcomes.
Finally, the programs must ensure a feedback loop, a way to keep trying new things and building the body of evidence, informing campus leaders about successes and failures, and driving the widespread adoption of policies that work for students. That includes a commitment by the Department’s research arm to gathering and assessing more evidence about what works and what does not in higher education, and to communicating that information to both policymakers and practitioners in order to change practices.
First, and most importantly, both Congress and the education secretary must make gathering and using evidence a priority across all Department programs.
Fortunately, there are plenty of opportunities within the law as it exists to improve these programs and expand the use of evidence to promote students’ enrollment in and completion of college, particularly through a greater focus on student interventions. Even better, policymakers at the Education Department need not wait for Congress to reauthorize the Higher Education Act before implementing some of the easiest suggestions here. Still, HEA reauthorization presents a once-in-a-decade opportunity to reorient these programs to ensure they serve students, not grantees, and lawmakers should be careful not to let that chance pass them by.
Recommendations for the U.S. Department of Education
Continue to promote evidence-based practices: The Education Department has made significant progress in its efforts to encourage the use of evidence-based practices, incorporating priorities for research-backed interventions into many of its competitive grants in higher education and PreK–12 programs in recent years. It should continue these efforts, asking that grantees use promising practices and evaluate their impact, and assessing when to raise the level of rigor it expects grantees to meet based on the available research.
Additionally, the Education Department should consider how to emphasize evidence-based policymaking in the Secretary’s supplemental priorities, a grab-bag of incentives the Department can incorporate into future grant competitions. A version of the priorities released for public comment in 2017, for instance, included one for “promoting innovation and efficiency…with an increased focus on improving student outcomes.”45 While the supplemental priorities are far from the only opportunity to incorporate evidence into higher education programs, this was an unfortunate missed opportunity to emphasize the use of evidence-based strategies and efforts to measure grantees’ success in ways that can contribute to the body of research on higher education efficiency. Future iterations of the supplemental priorities, as well as regulatory changes to the TRIO, GEAR UP, and/or MSI programs, should seek to signal the Department’s commitment to having colleges use proven and promising practices to help students succeed.
Evaluate programs and practices supported under the TRIO and GEAR UP programs: The Education Department has not comprehensively evaluated the TRIO programs in recent years, presumably due to the ban on conducting randomized controlled trials within these programs. However, the agency should consider the scope of the statutory ban and continue to support allowable research. For instance, it could have evaluators identify and assess the effectiveness of some of the most common practices of TRIO grantees, to determine whether and how particular interventions improved students’ outcomes. It did so recently, with a study of college advising in Upward Bound that revealed that using “Find the Fit” strategies with students increased the number and selectiveness of the colleges to which students applied.46 Or the Department could offer priority points to grantees willing to conduct rigorous assessments of their own programs. Even if the results are limited, conducting these kinds of evaluations may yield information useful to grantees and help build support among lawmakers and key stakeholders for broader evaluations of the programs.
The Education Department should ask that grantees use promising practices and evaluate their impact.
Similarly, the Department should explore what types of evaluations are possible and useful for improving the GEAR UP program, given that it is not subject to the prohibition on randomized controlled trials. While crafting large-scale evaluations has been a challenge in the past, the effort and cost of conducting them are justified considering the stakes for the low-income students the program primarily serves. Supporting research on GEAR UP’s effectiveness is especially important given that Education Secretary Betsy DeVos cited the lack of evidence about its effectiveness when she proposed eliminating the program and redirecting its funding into a new state formula grant program that would consolidate TRIO and GEAR UP funds to support college access activities. Before recommending such a drastic change, the Department would be wise to determine whether or not GEAR UP works.
Improve data collection of higher education programs: The Department should ask more of grantees receiving taxpayer dollars by reevaluating its current performance report requirements and requesting additional data where needed to understand participants’ outcomes. TRIO is a model for high-quality data: the Department already makes available extensive grantee-level information on students’ outcomes within these programs. Meanwhile, there is little information available about how—and how much—minority-serving institutions are spending on various activities within their grants. The Department should audit its available data across all of its discretionary grant programs and consider what information it could make public prior to an HEA reauthorization that might inform lawmakers about necessary reforms—and what data it should begin to collect to improve its information.
Clearly publish and disseminate research relevant to grantees: The Education Department’s Institute of Education Sciences (IES) maintains the What Works Clearinghouse, which lists available studies that have been reviewed according to IES evidence standards and assessed for their impact on student outcomes. However, the postsecondary research in that database is woefully inadequate and leaves out some of the existing research on improvement in higher education. IES should make a concerted effort to add more third-party studies and available research to the What Works Clearinghouse, and to clearly communicate to researchers how they can design studies eligible for inclusion, to ensure more qualifying research is conducted in the first place. Department staff should also drive grantees to those resources, or make other resources (like brief summaries of research studies, or information in grant applications about the types of research that might meet evidence priorities, some of which it already produces) readily available and widely disseminated. Another resource the Department can tap to reach more K–12-based grantees is the Regional Educational Laboratories, which conduct research and have connections with grantees and institutions across the country.
Recommendations for Congress
Consider large-scale reform to improve the efficacy of these programs: Hundreds of grantees around the country are engaged in distinct projects that all work toward the same goal. As more research becomes available about the efficacy of certain types of interventions, models for reaching students and ensuring they feel connected, and best practices for improving college access and success, lawmakers may wish to rethink how Congress finances and distributes funds for these programs, without cutting spending on critical national efforts. A new version of these programs could narrow the scope of activities to those that are most effective where the evidence is robust, or encourage the use of proven practices where the knowledge base is less developed. Congress should also encourage innovation to continue identifying new promising practices and the most cost-effective ways to ensure that the students most in need of additional resources get those supports. And lawmakers could award financial support to colleges or networks of colleges seeking to institutionalize the use of evidence by allowing funds to be used to identify evaluators, build data capacity, and build up institutional research staff.
However, based on the information available at this time, it is difficult to say exactly what that type of reform should look like. Greater investigation is needed, and any reforms should be informed by the types of research into the effectiveness of federally funded programs discussed throughout this paper. Congress should begin to lay the groundwork for a more robust public discussion with the other recommendations in this report.
Eliminate prior experience points: Lawmakers should do away with the antiquated and unfair system of prior experience points. The prior experience points in the TRIO program run counter to the very notion of evidence-based policymaking. A prospective grantee with a great idea backed by evidence stands little chance of winning a grant against a current grantee that gets a 15-point boost (on a 100-point scale)—more than double the point bonus used for evidence in the Department’s 2015 TRIO competition—simply for already having a grant.47
Congress should consider ways to ensure evidence-based practices are not just a priority, but the central priority.
Direct the Education Department to evaluate the effectiveness of the programs it supports: The Education Department is banned under the TRIO programs, and sometimes constrained by finances, from evaluating the effectiveness of the discretionary grant programs it supports with the most appropriate methods available. As a result, policymakers often are in the dark about what works, and grantees are unsure how they need to improve their projects. Congress should repeal the ban on conducting randomized controlled trials in the TRIO programs and direct the Department to conduct large-scale evaluations of both TRIO and GEAR UP.
Require the use of evidence in existing programs: Congress should consider ways to ensure evidence-based practices are not just a priority, but the central priority. The Department has already laid out a promising framework for incorporating evidence into existing programs and should continue to do so. However, assigning a one- or two-point priority for applicants proposing to conduct an evidence-based activity has led some applicants to propose the bare minimum, rather than taking action that would have a greater impact. To strengthen these incentives, the Department could require that a certain percentage of grant dollars be reserved for the evidence-based activities the grantee proposes to incorporate. It could offer a more generous funding level for grantees that engage in the interventions with the biggest effects (typically the more expensive interventions to conduct). Or it could establish a baseline share of grant spending each grantee must use on evidence-based activities.
Additionally, lawmakers should help facilitate a culture change in how grantees view these programs by rewriting the programs' purposes to emphasize the centrality of evidence-based practices. Doing so might have only modest effects, but it would signal to the Education Department and grantees alike that Congress not only allows but expects experimentation to be part of these programs.
Increase available funding for evaluation: For rigorous evaluations that yield usable and useful results, the Department will require funding. Those dollars can ensure that it provides grantees with technical assistance, gathers necessary data, and involves high-quality researchers. Congress should prioritize such evaluations by appropriating additional funding earmarked for rigorous evaluations at the Institute of Education Sciences and/or under the Government Performance and Results Act (GPRA). GPRA provided hundreds of thousands of dollars each year for data collection and program evaluation in higher education until lawmakers defunded its budget completely in fiscal year 2015.48 Congress could also set aside a portion of funds in programs like the Title III and Title V minority-serving institution programs for the Department to use for evaluation purposes. Increased funds for measuring impact would complement the evaluation efforts already required by the Foundations for Evidence-Based Policymaking Act that became law earlier this year.49
Grant the Education Department pooled evaluation authority for its higher education dollars: As with the Education Department authority for K–12 programs established by the Every Student Succeeds Act, Congress should allow the Department to set aside a small percentage of program dollars to build up the necessary funding to evaluate its higher-education programs. Pooled evaluation authority is especially important if Congress fails to provide additional funding for evaluation through other sources and programs, because it will allow the Department to prioritize certain programs for evaluation and borrow from other funding streams to afford that research.
Improve the quality of higher education data: Congress should pass a law establishing a secure, privacy-protected student-level data network at the Education Department to improve the quality of information available to and from colleges about their outcomes. Evidence-based policymaking requires better data to understand where and with whom the greatest need exists. Data also help to benchmark the goals for grantees that are spending federal dollars and to measure improvement and progress. Yet Congress has explicitly banned the Education Department from collecting information that could fill significant gaps in federal data sources. Lawmakers must overturn that ban and direct the Department to collect and provide key information about college performance to institutions and the public. Growing momentum behind the College Transparency Act, which would direct the Department to create a student-level data network to answer critical questions about higher education, is indicative of the broad constituencies—including policymakers from across the political spectrum—clamoring for this information.
Evidence-based policymaking requires better data to understand where and with whom the greatest need exists.
Launch a new program that will continue building the body of evidence: Congress should commit to innovation and experimentation in higher education with a tiered evidence competition that strives to improve students’ chances of success in college with grants to institutions to test, replicate, and expand the most promising practices. There is a thirst among some colleges to experiment with promising new practices, and a commitment among others to evaluate what works and to expand the use of such practices. The First in the World competition may only have run for a few years, but the demand was great. In 2014, for instance, the Department received 459 eligible applications, but could fund only 24 of them using the $75 million Congress made available for the competition.50
Congress should capitalize on that demand and continue promoting evidence-based practices at more colleges by establishing a grant competition to help finance these interventions, as the Innovation Grants program in the FINISH Act would. Tiered-evidence programs, like the Education Innovation and Research (EIR) program established through the Every Student Succeeds Act in 2015 or the Maternal, Infant, and Early Childhood Home Visiting Program operated by the Department of Health and Human Services, can create much-needed momentum for the adoption of evidence-based and promising strategies. As Brookings Institution researchers wrote in a 2016 report, “applicants to traditional grant programs often lack incentives to identify and use approaches backed by strong evidence,” and they receive little assistance in evaluating their programs or engaging in innovative and understudied efforts.51 A tiered-evidence grant program solves those problems and helps ensure that taxpayer dollars are spent as carefully as possible.
- The Promoting Real Opportunity, Success, and Prosperity through Education Reform (PROSPER) Act. H.R. 4508; introduced in the 115th Congress.
- Cheryl Blanco, Early Commitment Financial Aid Programs: Promises, Practices and Policies (Boulder, CO: Western Interstate Commission for Higher Education, August 2005), source">source.
- The author served in the Obama administration at the Education Department from 2015-2017.
- Margaret Cahalan, TRIO, The What Works Clearinghouse, and the Competitive Preference Priorities (CPPs): An Imposed Structured Practitioner-Research Collaboration (Washington, DC: The Pell Institute for the Study of Opportunity in Higher Education, 2018), source">source.
- Jill M. Constantine, Neil S. Seftor, Emily Sama Martin, Tim Silva, and David Myers, A Study of the Effect of the Talent Search Program on Secondary and Postsecondary outcomes in Florida, Indiana and Texas: Final Report from Phase II of the National Evaluation (Washington, DC: U.S. Department of Education, 2006), source">source.
- David Myers, Robert Olsen, Neil Seftor, Julie Young, and Christina Tuttle, The Impacts of Regular Upward Bound: Results from the Third Follow-Up Data Collection (Washington, DC: U.S. Department of Education, 2004), source">source. The first iteration of the same study did, however, find that students who entered Upward Bound with low expectations for their postsecondary educations were substantially more likely to complete high school. David Myers and Allen Schirm, The Impacts of Upward Bound: Final Report for Phase I of the National Evaluation (Washington, DC: U.S. Department of Education, 1999), source">source.
- Bradford W. Chaney, National Evaluation of Student Support Services: Examination of Student Outcomes After Six Years, Final Report (Washington, DC: U.S. Department of Education, 2010), source">source.
- Maureen Hoyler (Council for Opportunity in Education), public comment re: Docket ID: ED-2014-ICCD-0137-0001, November 19, 2014, source">source.
- U.S. Department of Education, “Fiscal Year 2019 Budget Request Congressional Justifications,” February 2018, R-109, source">source.
- Celeste Tarricone, “Bill Would Promise Pell Grants to Needy Sixth Graders,” Chronicle of Higher Education, July 11, 1997, source">source.
- Stephen Burd, “White House Plan for Low-Income Students Sparks Debate Helping Disadvantaged Students,” Chronicle of Higher Education, April 23, 1999, source">source.
- This is true at least of the effectiveness of the program itself; there is a near-complete evaluation of the effectiveness of sending text reminders to students in GEAR UP about deadlines for financial aid and college applications. Institute of Education Sciences (website), “Effectiveness of Promising Strategies in Federal College Access Programs: Study of College Transition Text Messaging in GEAR UP,” final report expected Spring 2020, source">source.
- Alberto Cabrera, Regina Deil-Amen, Radhika Prabhu, Patrick Terenzini, Chul Lee, and Robert E. Franklin, Jr., “Increasing the College Preparedness of At-Risk Students,” Journal of Latinos and Education 5, no. 2 (2006): 79–97, source">source.
- Using EXPLORE and PLAN Data to Evaluate GEAR UP Programs (Iowa City, IA: ACT, March 2007), source">source. The report was sponsored in part by the National Council for Community and Education Partnerships, a group that lobbies on behalf of GEAR UP.
- National Council for Community and Education Partnerships (website), “Evaluation Consortium,” source">source.
- Higher Education Act, §311.
- U.S. Department of Education (website), “Budget Tables,” 2019, source">source.
- Higher Education Fiscal Year 2015 Budget Request (Washington, DC: U.S. Department of Education, February 2014), T-34, source">source.
- U.S. Department of Education (website), “First in the World,” source">source. For more on tiered evidence competitions, see Andrew Feldman and Ron Haskins, “Tiered-Evidence Grantmaking,” Evidence-Based Policymaking Collaborative (website), September 9, 2016, source">source.
- U.S. Department of Education (website), “First in the World,” source">source.
- Regulations.gov (website), “Secretary’s Proposed Supplemental Priorities and Definitions for Discretionary Grant Programs,” U.S. Department of Education, docket ED-2017-OS-0078, Proposed Priority 2, source. For more on potential improvements to the supplemental priorities, see also Amy Laitinen and Clare McCann, “Comments on Education Department Supplemental Priorities for Discretionary Grant Competitions,” New America, November 13, 2017, source.
- Alina Martinez, Tamara Linkow, Hannah Miller, and Amanda Parsad, Study of Enhanced College Advising in Upward Bound: Impacts on Steps Toward College (Washington, DC: U.S. Department of Education, October 2018), source.
- U.S. Department of Education, Office of Postsecondary Education, Regulations for the TRIO Programs, 34 CFR Parts 643–647.
- Higher Education Fiscal Year 2015 Budget Request (Washington, DC: U.S. Department of Education, February 2014), T-154, source.
- Foundations for Evidence-Based Policymaking Act of 2017, Public Law 115-435, enacted January 14, 2019.
- Higher Education Fiscal Year 2017 Budget Request (Washington, DC: U.S. Department of Education, February 2016), R-94, source.
- Andrew Feldman and Ron Haskins, “Tiered-Evidence Grantmaking.” Evidence-Based Policymaking Collaborative (website), September 9, 2016, source.
Conclusion
Over the past 50 years, gaps in college attainment have remained stubbornly persistent for students of color, even as the share of college students who are non-white has increased.52 Additionally, even many public universities have raised the prices they charge the lowest-income students; over half now charge a net price of more than $10,000 to students from families earning less than $30,000.53 Clearly, a genuine commitment to expanding access to a quality higher education for disadvantaged students will require more than the small-scale, fragmented efforts now underway across the nation.
The federal government must instead invest its dollars more wisely, in a coordinated network of providers using promising and proven practices. It must evaluate the impact of those programs on student outcomes and adjust course, reinvesting rather than slashing federal dollars, to meet the formidable challenge ahead. And it must make new investments in testing, measuring, and learning what works, and in disseminating effective interventions to providers, if it hopes to break a decades-long cycle of failing to maximize the potential of these federally funded programs. Students’ futures, especially those of low-income students and students of color, depend on it.
Citations
- Kelly Field, “Are the Right Students ‘Upward Bound?’” Chronicle of Higher Education, August 17, 2007, source; and Doug Lederman, “Little Change for Upward Bound,” Inside Higher Ed, September 25, 2006, source.
- Kelly Field, “Are the Right Students ‘Upward Bound?’” Chronicle of Higher Education, August 17, 2007, source.
- Ibid.
- Kelly Field, “Education Department Agrees to End Controversial Upward-Bound Study,” Chronicle of Higher Education, February 25, 2008, source; Kelly Field, “Senate Approves Measure Blocking Evaluation of Upward Bound,” Chronicle of Higher Education, October 19, 2007, source; and Kelly Field, “Are the Right Students ‘Upward Bound?’” Chronicle of Higher Education, August 17, 2007, source.
- Higher Education Opportunity Act, §428H, enacted August 14, 2008.
- Jennifer Ma, Matea Pender, and Meredith Welch, Education Pays 2016: The Benefits of Higher Education for Individuals and Society (New York: The College Board, 2016), source.
- Raj Chetty, John N. Friedman, Emmanuel Saez, Nicholas Turner, and Danny Yagan, “Mobility Report Cards: The Role of Colleges in Intergenerational Mobility,” Opportunity Insights, July 2017, source. The Equality of Opportunity Project is now called Opportunity Insights.
- Ibid.
- “The State of For-Profit Colleges,” Center for Responsible Lending (website), January 29, 2019, source; and U.S. Department of Education, “Fact Sheet: Department of Education Announces Release of New Program-Level Gainful Employment Data,” November 16, 2017, source.
- Lul Tesfai, Kim Dancy, and Mary Alice McCarthy, Paying More and Getting Less: How Nondegree Credentials Reflect Labor Market Inequality Between Men and Women (Washington, DC: New America, September 2018), source.
- Robert Kelchen, “A Look at Pell Grant Recipients’ Graduation Rates,” Brookings (website), October 25, 2017, source; and Jean Johnson, Jon Rochkind, Amber N. Ott, and Samantha DuPont, With Their Whole Lives Ahead of Them: Myths and Realities About Why So Many Students Fail to Finish College (Brooklyn, NY: Public Agenda, December 9, 2009), source.
- Advancing Diversity and Inclusion in Higher Education (Washington, DC: U.S. Department of Education, November 2016), source.
- U.S. Department of Education, National Center for Education Statistics, “Table 362.20: Graduation Rate from First Institution Attended within 150 Percent of Normal Time,” prepared December 2017, source.
- City University of New York (website), “About: What is ASAP?” source.
- Susan Scrivener, Michael J. Weiss, Alyssa Ratledge, Timothy Rudd, Colleen Sommo, and Hannah Fresques, Doubling Graduation Rates: Three-Year Effects of CUNY’s Accelerated Study in Associate Programs (ASAP) for Developmental Education Students (New York: MDRC, February 2015), source.
- MDRC, “Ohio Programs Based on CUNY’s Accelerated Study in Associate Programs (ASAP) More than Double Graduation Rates,” press release, December 2018, source.
- James Kvaal and John Bridgeland, Moneyball for Higher Education: How Federal Leaders Can Use Data and Evidence to Improve Student Outcomes (Washington, DC: Results for America, January 2018), source.
- Reshma Patel, Lashawn Richburg-Hayes, Elijah de la Campa, and Timothy Rudd, “Performance-Based Scholarships: What Have We Learned? Interim Findings from the PBS Demonstration,” MDRC policy brief, August 2013, source.
- Benjamin L. Castleman and Lindsay C. Page, “Freshman Year Financial Aid Nudges: An Experiment to Increase FAFSA Renewal and College Persistence,” Journal of Human Resources 51, no. 2 (June 2014): 389–415, source.
- Eric P. Bettinger and Rachel B. Baker, “The Effects of Student Coaching: An Evaluation of a Randomized Experiment in Student Advising,” Educational Evaluation and Policy Analysis 36, no. 1 (March 2014): 3–19, source.
- Eric Kelderman, “‘Higher Education’ Isn’t So Popular, Poll Finds, but Local Colleges Get Lots of Love,” Chronicle of Higher Education, May 21, 2018, source. For more on the issue of trust in higher education, see Rachel Fishman, Ernest Ezeugo, and Sophie Nguyen, Varying Degrees 2018: New America’s Annual Survey on Higher Education (Washington, DC: New America, May 2018), source.
- The Promise of Evidence-Based Policymaking (Washington, DC: Commission on Evidence-Based Policymaking, September 7, 2017), source; and Foundations for Evidence-Based Policymaking Act of 2017, P.L. 115-435, enacted January 14, 2019.
- The Fund for Innovation and Success in Higher Education (FINISH) Act, S. 1059.
- For more on the need for evaluation in federal experiments, see Clare McCann, Amy Laitinen, and Andrew Feldman, Putting the Experiment Back in the Experimental Sites Initiative (Washington, DC: New America, January 23, 2018), source.
- The Promoting Real Opportunity, Success, and Prosperity through Education Reform (PROSPER) Act, H.R. 4508, introduced in the 115th Congress.
- Cheryl Blanco, Early Commitment Financial Aid Programs: Promises, Practices and Policies (Boulder, CO: Western Interstate Commission for Higher Education, August 2005), source.
- The author served in the Obama administration at the Education Department from 2015 to 2017.
- Margaret Cahalan, TRIO, The What Works Clearinghouse, and the Competitive Preference Priorities (CPPs): An Imposed Structured Practitioner-Research Collaboration (Washington, DC: The Pell Institute for the Study of Opportunity in Higher Education, 2018), source.
- Jill M. Constantine, Neil S. Seftor, Emily Sama Martin, Tim Silva, and David Myers, A Study of the Effect of the Talent Search Program on Secondary and Postsecondary Outcomes in Florida, Indiana, and Texas: Final Report from Phase II of the National Evaluation (Washington, DC: U.S. Department of Education, 2006), source.
- David Myers, Robert Olsen, Neil Seftor, Julie Young, and Christina Tuttle, The Impacts of Regular Upward Bound: Results from the Third Follow-Up Data Collection (Washington, DC: U.S. Department of Education, 2004), source. The first iteration of the same study did, however, find that students who entered Upward Bound with low expectations for their postsecondary educations were substantially more likely to complete high school. David Myers and Allen Schirm, The Impacts of Upward Bound: Final Report for Phase I of the National Evaluation (Washington, DC: U.S. Department of Education, 1999), source.
- Bradford W. Chaney, National Evaluation of Student Support Services: Examination of Student Outcomes After Six Years, Final Report (Washington, DC: U.S. Department of Education, 2010), source.
- Maureen Hoyler (Council for Opportunity in Education), public comment re: Docket ID: ED-2014-ICCD-0137-0001, November 19, 2014, source.
- U.S. Department of Education, “Fiscal Year 2019 Budget Request Congressional Justifications,” February 2018, R-109, source.
- Celeste Tarricone, “Bill Would Promise Pell Grants to Needy Sixth Graders,” Chronicle of Higher Education, July 11, 1997, source.
- Stephen Burd, “White House Plan for Low-Income Students Sparks Debate Helping Disadvantaged Students,” Chronicle of Higher Education, April 23, 1999, source.
- This is true at least of the effectiveness of the program itself; there is a near-complete evaluation of the effectiveness of sending text reminders to students in GEAR UP about deadlines for financial aid and college applications. Institute of Education Sciences (website), “Effectiveness of Promising Strategies in Federal College Access Programs: Study of College Transition Text Messaging in GEAR UP,” final report expected Spring 2020, source.
- Alberto Cabrera, Regina Deil-Amen, Radhika Prabhu, Patrick Terenzini, Chul Lee, and Robert E. Franklin, Jr., “Increasing the College Preparedness of At-Risk Students,” Journal of Latinos and Education 5, no. 2 (2006): 79–97, source.
- Using EXPLORE and PLAN Data to Evaluate GEAR UP Programs (Iowa City, IA: ACT, March 2007), source. The report was sponsored in part by the National Council for Community and Education Partnerships, a group that lobbies on behalf of GEAR UP.
- National Council for Community and Education Partnerships (website), “Evaluation Consortium,” source.
- Higher Education Act, §311.
- U.S. Department of Education (website), “Budget Tables,” 2019, source.
- Higher Education Fiscal Year 2015 Budget Request (Washington, DC: U.S. Department of Education, February 2014), T-34, source.
- U.S. Department of Education (website), “First in the World,” source. For more on tiered evidence competitions, see Andrew Feldman and Ron Haskins, “Tiered-Evidence Grantmaking,” Evidence-Based Policymaking Collaborative (website), September 9, 2016, source.
- U.S. Department of Education (website), “First in the World,” source.
- Regulations.gov (website), “Secretary’s Proposed Supplemental Priorities and Definitions for Discretionary Grant Programs,” U.S. Department of Education, docket ED-2017-OS-0078, Proposed Priority 2, source. For more on potential improvements to the supplemental priorities, see also Amy Laitinen and Clare McCann, “Comments on Education Department Supplemental Priorities for Discretionary Grant Competitions,” New America, November 13, 2017, source.
- Alina Martinez, Tamara Linkow, Hannah Miller, and Amanda Parsad, Study of Enhanced College Advising in Upward Bound: Impacts on Steps Toward College (Washington, DC: U.S. Department of Education, October 2018), source.
- U.S. Department of Education, Office of Postsecondary Education, Regulations for the TRIO Programs, 34 CFR Parts 643–647.
- Higher Education Fiscal Year 2015 Budget Request (Washington, DC: U.S. Department of Education, February 2014), T-154, source.
- Foundations for Evidence-Based Policymaking Act of 2017, Public Law 115-435, enacted January 14, 2019.
- Higher Education Fiscal Year 2017 Budget Request (Washington, DC: U.S. Department of Education, February 2016), R-94, source.
- Andrew Feldman and Ron Haskins, “Tiered-Evidence Grantmaking,” Evidence-Based Policymaking Collaborative (website), September 9, 2016, source.
- Advancing Diversity and Inclusion in Higher Education (Washington, DC: U.S. Department of Education, November 2016), source.
- Stephen Burd, “Public University Trends,” Undermining Pell: Volume IV (Washington, DC: New America, October 2018), 11–12, source.