Findings

Federal expectations for P3 were high. The program, developed in response to what federal leaders had learned on the ground about the challenges of serving opportunity youth, was a long time coming. Leaders at federal agencies saw the flexibilities P3 offered as a new and exciting way to innovate and hoped the program would catalyze systems change at the selected sites. The option to blend federal funds represented a major departure from the usual rules. In theory, it could allow sites to collaborate across organizations in ways that federal funding restrictions and reporting requirements normally limit. Waivers of programmatic requirements, such as eligibility criteria, are more common in some federal programs than in others. Packaged together, these flexibilities made for a formidable toolkit, and the federal agencies expected to receive innovative and bold proposals in response.

Ultimately, P3 didn’t live up to these lofty expectations, particularly when it came to the program’s goals around systems change. P3’s results over the past decade demonstrate that while flexibility can help local programs address pain points in service delivery, it is not a lever for sustained systems change. This does not mean, however, that it stymies systems change. Los Angeles’s P3 efforts, for example, resulted in a new hub called the ReLAY Institute,1 which builds partnerships and organizational capacity, shares data and information, and promotes innovative practices with the goal of “transform[ing] service delivery systems” that support disconnected youth.2 Broward County and Hartford both used P3 to build an integrated data system,3 which is often key to breaking down silos between systems. These interventions showed that systems change efforts could leverage P3’s flexibilities to advance systems-level work. But flexibility on its own is not a forcing mechanism for systems change.

There are four key reasons why P3 was unable to fully capitalize on the goals federal leaders set out to achieve, especially regarding systems change.

1. Sites Lacked Buy-In and Understanding of P3’s Flexibilities

Federal agencies were largely disappointed with the proposals they received from prospective P3 sites in the early rounds. Many of the applications requested waivers for adjustments that sites could already make without P3. Most did not touch the blending flexibility, which was theoretically the most novel aspect of P3. Agency staff were perplexed. Why were applicants requesting unnecessary waivers? Why did they forgo some of the flexibilities entirely?

Our research found that this usually occurred because site staff and federal agency staff had different understandings of what was already allowed under the funding streams included in P3. This accounted for the unnecessary waiver requests in P3 applications. Sites thought they needed special permission from the federal government to use the money in a particular way; the agency’s interpretation of the law said that sites could already do so without a waiver.

These misconceptions often had murky origins. Sometimes, sites inadvertently imposed nonexistent restrictions around federal funds. In these instances, sites had organizational norms or rules that constrained the use of federal dollars to ensure total compliance with the law. Staff assumed the federal government imposed these rules. They did not realize the limitations were of their own making until they submitted a P3 application and heard from federal agencies that the issue they wanted a P3 waiver to resolve was already allowable.

At the same time, sites didn’t have much practice with the kind of creative thinking the federal agencies expected. Pilot sites were used to operating in a compliance-first environment when it came to federal dollars. In many cases, sites struggled to believe that some P3 flexibilities were allowable and would not result in a penalty. According to Mathematica, four of the nine first-round sites said “they were unable to secure enough trust and buy-in from their state and local partners to implement their planned approaches.”4

The blending flexibility, in particular, was a nonstarter for most sites. Mathematica researchers found that staff at many pilot sites were not familiar with blending and did not know how to use that flexibility to support their work.5 A person at one site told us that if blending had been a requirement, the site’s project would have failed because blending was simply too difficult. Another said that their organization had no accounting mechanism to blend and that trying to create one would take more time and effort than it was worth. Some also saw blending as a threat to their work: Staff from one P3 site described blending as “dangerous” because it could cause confusion during an audit, which could, in turn, impact their ability to secure future grants.

These concerns meant that site staff generally took advantage of the “safer” flexibilities, like braiding funding and waivers, rather than the more innovative aspects of blending, which they found too risky to pursue.

But when sites did cultivate buy-in for P3, it proved impactful in their communities. In Los Angeles, for example, city leaders actively advocated for P3 to build support among their core partners and a broad coalition of public, nonprofit, and philanthropic organizations.6 Securing this buy-in early paid dividends as the P3 work unfolded. The city was able to convene over 50 partner agencies around a goal of creating a stronger, more cohesive youth services system city-wide, crafting a strategic plan to guide their efforts.7 Staff participated in work groups aimed at advancing key plan objectives, held regional collaborative meetings to support information-sharing, and developed a revised intake process for youth centers that more easily connected youth with education, job training, and mental health services, among other services.8

These efforts reshaped how partners from different systems worked together, and Los Angeles became one of only two sites in the first round that saw sustained systems change. In other words, the Los Angeles site did not simply make changes to how one particular organization serves youth, but rather to how the entire network of youth service providers functions and how partners within it communicate. Today, Los Angeles is the only site from the original three rounds with an active P3 authority.

The discrepancy between federal expectations and site realities highlights a tension at the heart of P3: Simply providing the flexibility was not enough for sites to take advantage of it, let alone to drive systems change, P3’s stated goal. Minor adjustments often eased barriers to effective service delivery, even if they did not lead to actual systems change.

2. P3 Did Not Include Planning Time Before Implementation or Technical Assistance for Proposal Development

Successful systems change requires partners across different systems to spend time developing a shared vision before implementation of their effort begins. But P3’s structure did not build in planning time for sites in advance of their P3 intervention. As a result, sites not already engaged in systems change work struggled to use P3 to foster lasting systems change, though some were able to take initial steps.

Mathematica found that only two sites in the first round—Los Angeles, California, and Broward County, Florida—were able to foster sustained systems change through P3.9 Los Angeles focused on building a new youth service delivery system that was coordinated across all partners, while Broward County used P3 to relaunch efforts to build an integrated data system.10 In both places, partners had engaged in cross-system planning efforts before P3, and once they received the P3 authority, they used it as an opportunity to set explicit systems change goals.11 The other sites, where cross-system partnerships were not as developed, generally approached P3 from a programmatic perspective and didn’t use it as a tool to support systems change.12

Sites also could not meaningfully workshop and refine their P3 proposals, including their requested waivers, with federal agency staff. Sites submitted applications, which were then peer-reviewed by federal staff. For the applications selected to move ahead, the requested waivers and flexibilities went through an extensive approvals process that included sign-off from the relevant agency secretaries. Once applications were submitted, the sites had little opportunity to get feedback from the agencies and adjust their proposal based on that input. Sites unsure how to use P3 as a vehicle for systems change submitted more programmatic proposals and could not revise their plans to focus more on systems change elements.

The lack of a planning period and limited proposal development technical assistance also exacerbated the problems discussed in the finding above. Without time to cultivate buy-in and support to craft more ambitious, creative proposals, sites struggled to submit applications that met the federal government’s high expectations.

3. Program Design Elements Prioritized Programmatic Flexibility Over Systems Change

In the initial notice inviting applications to P3 in the Federal Register, the government offered several examples of potential pilot projects.13 All were related to collaboration, alignment, or partnerships across systems that serve disconnected youth. But of the 14 sites in the first three rounds of P3, only Los Angeles and Broward County set goals relating to systems change.14

Outside of these two places, most sites’ P3 interventions focused on improving program delivery for youth populations. The New York City pilot, for instance, extended the time period for which parenting youth could receive WIOA services and provided them with a case manager who could connect them to free or subsidized child care.15 Active at two sites in Brooklyn, this intervention was aimed at improving the services that parenting youth received rather than transforming how youth services operate citywide.16

Despite ambitious goals around breaking down silos and promoting systems alignment, P3’s design placed more emphasis on providing programmatic flexibility than on creating systems change. For example, sites had to track youth education and workforce outcomes, but not progress on data-sharing, collaboration, or how agencies and institutions work together.17 Mathematica researchers hypothesized that this sent a message to the sites about which goals the government cared about most, incentivizing a focus on service delivery over systems-oriented efforts.

Another barrier to systems change resulted from P3’s time-limited authority, which didn’t incentivize the long-term, structural changes needed to foster systems alignment. Pilot sites were granted P3 flexibilities for roughly two years, though Congress later created an option for a five-year extension if a site demonstrated strong performance.18 But once the P3 term was up, the flexibilities would expire. This made it hard for sites to commit to long-term systems change efforts that depended on the P3 flexibilities. There was little motivation for sites to engage in a major systems alignment project that might be rendered unusable as soon as their P3 term was over.

Indeed, most of the site staff we spoke with said that the changes they implemented during P3 did not last once their P3 authority expired. We found some exceptions, like Los Angeles, but the city’s cross-system coordination and planning were under way before it became a P3 pilot and were not reliant on P3 flexibilities like waivers to continue.

While most staff we interviewed reported that the P3 changes did not last at their site, many did say that P3 either created or strengthened relationships between local partners across different systems. Improved connections have continued after P3 and have enhanced local collaboration.

4. P3 Phased Out Grant Support, Reducing Appeal to Potential Applicants

P3 is a relative rarity in this era of political polarization: It enjoys bipartisan support from federal policymakers. P3 owes its popularity and longevity, in part, to its low price tag: zero dollars.

Because Congress has never appropriated any funds to support P3’s implementation,19 federal agencies involved in P3 have had to find money for it elsewhere in their budgets. Federal staff assigned to develop the notice inviting applications, evaluate applications, provide technical assistance, and implement the evaluations had to find time for this work in their normal course of duties. Without dedicated staff, coordinating across agencies became a challenge. “It was a tax on the agencies,” one former federal employee told us.

The agencies also had to find the money for the $700,000 grants provided to pilot sites to support implementation and evaluation. Because there were no appropriations for the grants, ED created the grants by pooling funding from programs across the involved agencies. Taking money from already sparsely funded programs proved difficult to sustain. The agencies reduced the grant size in the second and third rounds of P3, before phasing it out entirely in the fourth round, along with the evaluation requirement. Without evaluations, the federal government lost the ability to learn the extent to which P3 was working.

The elimination of the site grants delivered a blow to interest in the program. Since the third round, the last to offer grants, no selected site has run a pilot involving programs from more than one federal agency,20 and the federal government has received very few applications despite regular solicitation.

All of the site staff we spoke with said that the grant was vital, and that without it, they would not have applied or been able to implement their proposed P3 intervention. One site leader said that the grant did not cover all staff capacity needed for the project, and colleagues had to work long hours to fill the gaps. Another told us their site had started brainstorming proposals for future rounds of P3 but abandoned their plans once they found out it would not come with a grant to fund the staff capacity needed to implement the proposal.

These strong feelings were, in part, due to the evaluation element of P3. For the first three rounds of P3, sites had to propose an evaluation of their P3 intervention in their application, and then, if selected, fund it using their grant dollars. In our interviews, site staff reported that the evaluation requirements were onerous. Applications got more points for proposing rigorous experimental evaluations—high-quality studies—but these are expensive and hard to execute. Not all sites had the same level of in-house evaluation expertise; some proposed evaluations beyond what they could perform. Without the P3 grant funding (and considerable evaluation technical assistance from Mathematica), site staff said, they would not have been able to implement the evaluation requirements.

Citations
  1. ReLAY Institute (website), source.
  2. Brown, Performance Partnership Pilots for Disconnected Youth, 9, source.
  3. Stanczyk, Yañez, and Rosenberg, Performance Partnership Pilots for Disconnected Youth (P3): Implementation Study, 11–12, source.
  4. Brown, Performance Partnership Pilots for Disconnected Youth, 28, source.
  5. Stanczyk, Yañez, and Rosenberg, Performance Partnership Pilots for Disconnected Youth (P3): Implementation Study, 27, source.
  6. Brown, Performance Partnership Pilots for Disconnected Youth, 6–11, source.
  7. Brown, Performance Partnership Pilots for Disconnected Youth, 6–11, source.
  8. Brown, Performance Partnership Pilots for Disconnected Youth, 6–11, source.
  9. Brown, Performance Partnership Pilots for Disconnected Youth, source.
  10. Brown, Performance Partnership Pilots for Disconnected Youth, 4, source.
  11. Brown, Performance Partnership Pilots for Disconnected Youth, 4, source.
  12. Brown, Performance Partnership Pilots for Disconnected Youth, source.
  13. “Applications for New Awards; Performance Partnership Pilots,” 79 Federal Register 70034, November 24, 2014, source.
  14. Brown, Performance Partnership Pilots for Disconnected Youth, 4, source.
  15. Theresa Anderson, Alex Carther, and Alan Dodkowitz, Supporting Young Parents: Impacts of the New York City Performance Partnership Pilot (P3) on Young Parents’ Outcomes (Urban Institute, March 2022), source.
  16. Anderson, Carther, and Dodkowitz, Supporting Young Parents: Impacts of the New York City Performance Partnership Pilot (P3) on Young Parents’ Outcomes, source.
  17. Stanczyk, Yañez, and Rosenberg, Performance Partnership Pilots for Disconnected Youth (P3): Implementation Study, 40, source.
  18. Because P3 was tied to the use of a particular year’s appropriation, the flexibility was effective only for the time during which the appropriation could be spent. This period varied by program but was no longer than 27 months. The agencies developed some workarounds for that limitation, and Congress later gave them the authority to extend pilots for up to five years if they demonstrated strong performance, but the period in which pilots were expected to be implemented was still relatively brief.
  19. Stanczyk, Yañez, and Rosenberg, Performance Partnership Pilots for Disconnected Youth (P3): Implementation Study, 3, source.
  20. As noted in our “Implementation of P3” section, the federal government approved some single-agency P3 pilots, either through ED and DOL, after round three. We do not focus on them in this analysis because they had less emphasis on breaking down federal funding silos.
