
Measuring Up

On January 29, the U.S. Department of Education released a blueprint for how it plans to revise the gainful employment (GE) regulations, which the Obama administration put in place in 2014. Most notably, the Department’s proposed rule would eliminate all sanctions for career-oriented programs that leave students with large debt but without the training to land a well-paying job after graduation. The proposal would preserve only a modified version of the current disclosure requirements, and even those could be further weakened if for-profit colleges get their way during the second round of negotiations. The current rules disclose, and hold career-oriented college programs accountable for, the amount of debt that graduates borrow relative to the amount they earn a few years after completing. For-profit college leaders and lobbyists have instead called for substituting actual students’ earnings with local estimates derived from the Bureau of Labor Statistics (BLS). While the Department’s proposal to strip any consequence from the GE regulations may seem brazen by comparison, using BLS data in place of actual graduates’ earnings would have nearly the same impact as no accountability at all. Unfortunately, using BLS estimates instead of real earnings data would not only tell prospective students very little about the quality of the program they are considering; it would actively mislead them. More troubling still, this approach would prevent the government from holding individual colleges accountable.

To illustrate just how misleading it would be to use BLS data for the purpose of measuring program outcomes, we compared national and local BLS earnings with actual earnings from graduates of specific career-training programs. We found that, on average, the median annual earnings for graduates of all programs subject to the gainful employment regulations were $27,494. But if local BLS estimates were used instead, the median annual earnings would rise to an average of $49,341—an increase of $21,847, or nearly 80 percent. 
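The gap described above is a straightforward comparison of two averages. A minimal sketch of that arithmetic, using only the two averaged figures reported here (the underlying program-level data are not reproduced in this piece):

```python
# Averaged figures from the analysis above; the program-level records
# behind them are not shown here.
actual_median = 27_494   # average of graduates' actual median annual earnings
bls_median = 49_341      # the same average if local BLS estimates were used

gap = bls_median - actual_median
pct_increase = gap / actual_median * 100

print(f"Gap: ${gap:,}")                  # Gap: $21,847
print(f"Increase: {pct_increase:.0f}%")  # Increase: 79% ("nearly 80 percent")
```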

Some cosmetology college owners have argued that reported earnings for graduates of cosmetology and massage therapy programs are inaccurate because they do not include the unreported tip income that workers in these fields receive. But even after excluding those programs, using BLS data still inflates earnings by an average of $23,530 for graduates of all other programs. Overall, we found that in 96 percent of the programs analyzed, graduates’ actual earnings were lower than the median BLS earnings, although the gaps varied by field of study. Of the 10 most common fields reflected in the GE data, business administration programs had the largest gap between graduates’ real earnings and the corresponding local estimates: on average, $63,824, or 150 percent. Licensed practical nursing programs had the narrowest gap, a difference of around $6,600, or 18 percent.

Our analysis shows that using BLS data, rather than actual earnings, would undermine the very purpose of the GE rule, which is to provide meaningful federal oversight of career-training programs. According to the Education Department, 803 programs failed the debt-to-earnings measures in 2017 under the existing rules. But replacing actual earnings data with the higher of the average or median BLS earnings would shrink the number of failing programs from 803 to just 21. Likewise, the number of “zone” programs, those with outcomes that were unsatisfactory but not quite failing, would drop from 1,239 to 124.
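To see why the substitution matters so much, consider how the debt-to-earnings (D/E) test works: earnings sit in the denominator, so inflating them shrinks the rate. The sketch below uses only the annual-earnings prong of the 2014 rule (pass at or below 8 percent, zone at or below 12 percent, fail above that); the actual rule also includes a discretionary-income test and specific loan-amortization details, and the loan payment and BLS mean below are hypothetical illustrations, not figures from the analysis.

```python
def de_outcome(annual_loan_payment: float, annual_earnings: float) -> str:
    """Classify a program under the annual-earnings prong of the D/E test."""
    rate = annual_loan_payment / annual_earnings
    if rate <= 0.08:
        return "pass"
    if rate <= 0.12:
        return "zone"
    return "fail"

payment = 3_000                        # hypothetical annual loan payment
actual_earnings = 27_494               # graduates' actual median earnings (avg)
bls_mean, bls_median = 48_000, 49_341  # BLS figures; the mean is hypothetical

print(de_outcome(payment, actual_earnings))            # ~10.9% -> "zone"
# The proposal would use the HIGHER of the BLS average or median:
print(de_outcome(payment, max(bls_mean, bls_median)))  # ~6.1% -> "pass"
```

The same program flips from “zone” to “pass” without anything about its graduates’ actual outcomes changing, which is how 803 failing programs could fall to 21.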

Using BLS data in the GE calculations would bury any indication of program quality and would undermine federal efforts to hold specific college programs accountable for failing to provide real career options for their graduates.