Profiles in Improvement: How Illinois Developed a Data System to Advance Teacher Preparation

Brief
March 29, 2023

Introduction

Not long ago, teacher preparation programs in Illinois placed phone calls to their alumni in an attempt to learn which of them were working in full-time teaching positions (and where), even though Illinois’s State Board of Education collects public school teacher employment data. Recognizing these kinds of inefficiencies, and the benefits of offering stakeholders a more holistic view of the state’s teacher preparation programs, the State Board developed the Illinois Educator Preparation Profile (IEPP) online platform.

Two years into its existence, the IEPP is already helping to provide insights on the state’s preparation programs to three essential stakeholder groups: prospective teachers, K–12 school hiring managers, and, most importantly, the preparation programs themselves. By offering useful data, and providing performance designations tied to them, Illinois hopes to support programs in making improvements and to support prospective teachers and K–12 hiring managers in making enrollment and hiring decisions that serve them and the state’s students well.

Other states can learn from Illinois’s efforts to build a system that focuses stakeholders on key elements of ensuring a robust, diverse, and well-prepared novice teacher workforce.

Why Do States Need Data on Their Novice Teacher Pipelines?

The need for data on the teacher pipeline has never been as critical as it is right now. The COVID-19 pandemic upended many aspects of public life, including public schools, where staffing challenges that had once been confined to specific types of schools or subjects ballooned. Many local education agencies (LEAs) have struggled to find suitable replacements when teachers depart, in part because of challenges that preparation programs have had in recruiting a sufficiently large and adequately diverse cohort of prospective new teachers. To fill teacher vacancies, schools report that they increasingly have to employ individuals with little to no preparation or subject knowledge, despite research indicating that both are critical to new teachers’ success in the short and long term.[1]

State leaders want to understand how to encourage and support teacher preparation programs (TPPs) in developing a diverse and robust crop of new teachers who are successful in their roles and remain in the profession. States also seek to understand whether there are specific preparation pathways and/or teacher candidate attributes that are most likely to lead to these outcomes. But it is difficult to understand the true condition of prospective teacher pipelines without useful, reliable, and transparent data systems, and most states do not currently have them.[2]

What Kind of Data Do States Already Have and What Else Is Needed?

Every state has some data available on TPPs. In the 1998 reauthorization of the federal Higher Education Act, Congress inserted requirements in Title II for states and institutions of higher education (IHEs) to annually report data on their TPPs. This move was rooted in the logic that quality teaching is a prerequisite for student success, and that substantive pre-service preparation is essential for quality teaching.[3] The new data collection focused on the requirements for entry into and completion of TPPs, along with some demographic data on candidates enrolled in the program.

In 2012, the Obama administration turned to the federal negotiated rulemaking process to, among other things, push to expand the set of Title II metrics to include TPP completers’ employment outcomes. The administration argued that these outcomes—such as job placement rates, job performance, and short-term retention in the profession—were more relevant to the strength of the teaching profession and students’ educational success than the attributes of the TPPs themselves, such as GPA requirements for entry. The proposal led to vigorous debate among the various stakeholders involved, primarily due to pushback from TPPs on the administration’s plan to have states publicly rate and hold TPPs accountable based on the data. But there were also concerns about the validity of employment outcome data, as well as the feasibility of reporting them. These types of metrics would require states not only to have trustworthy data systems at the early childhood, K–12, postsecondary, and workforce levels, but also to have a process for aligning and sharing data across all of them.

The administration’s regulatory efforts ultimately failed and, for a time, dampened the political appetite at the federal level for revisiting teacher preparation data reporting. But some state education agencies saw value in several aspects of the administration’s proposal. With the aid of federal State Longitudinal Data System grant funds, leaders from Delaware, Massachusetts, New Jersey, and Rhode Island, among others, developed data systems to help identify which TPPs were sufficiently meeting the needs of the teachers they prepare and the schools that employ them, and to encourage ongoing improvement at all of them.

The Illinois Educator Preparation Profiles Platform

Drawing on this work, in December 2020, Illinois publicly launched a new TPP data reporting platform, the Illinois Educator Preparation Profiles. The story of how the IEPP came to fruition—from its initial stakeholder committee discussions in 2016 to its role in state reauthorization decisions in 2022—offers insights that can guide other states. But first, it is necessary to understand the IEPP in its current form.

For every IHE-based TPP in Illinois—at the undergraduate, graduate, and post-baccalaureate levels, and in both traditional and alternative formats—the IEPP reports data on 11 indicators across four equally weighted domains:

1) Candidate Selection and Completion, which evaluates the academic strength and racial/ethnic diversity of candidates who enroll in each TPP, as well as whether candidates who are members of groups underrepresented in teaching (i.e., racial/ethnic minority, first-generation college attendance, and/or low-income socioeconomic status) are completing programs within the standard program length.

2) Knowledge and Skills for Teaching, which evaluates how well candidates perform on state teaching assessments and how well they believe their program has prepared them for the classroom.

3) Performance as Classroom Teachers, which evaluates how program completers are faring on the job.

4) Contribution to State Needs, which identifies the extent to which program completers are employed in Illinois schools and early learning settings and their persistence in the profession, with an emphasis on the extent to which completers are employed and persisting in high-need schools.

To collect this information, the Illinois State Board of Education (ISBE) draws from its own data sets, from data shared by the Illinois Department of Employment Security and national standardized test vendors, and from reports that TPPs and LEAs provide directly. For each indicator, data are compared to a minimum standard and a state target, and assigned a score based on performance relative to these benchmarks (see Figure 1).[4]

Figure 1. Scoring of IEPP Indicators

Source: Adapted from “Illinois Preparation Profile Continuous Improvement and Accountability System,” Partnership for Educator Preparation, Illinois State Board of Education, October 2020, p. 13.

Within each IEPP domain, the indicator scores are aggregated based on their relative “indicator weights” to create an overall domain score. Then all four domain scores are aggregated into a total score (see Figure 2).
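
For readers who want a concrete sense of the arithmetic, the sketch below illustrates the general shape of this two-step calculation: score each indicator against its minimum standard and state target, then roll indicator scores up into weighted domain scores and an equally weighted total. The point values, indicator weights, cut points, and sample data shown are illustrative placeholders, not ISBE’s actual scoring rules, which are defined in the documentation cited in Figure 1.

```python
# Illustrative sketch only: the real point values, indicator weights, and
# benchmarks live in ISBE's IEPP scoring guide (see Figure 1 and note [4]);
# the numbers below are hypothetical placeholders.

def score_indicator(value, minimum, target):
    """Assign a score based on where a program's value falls relative to the
    minimum standard and the state target (actual scoring rules differ)."""
    if value >= target:
        return 1.0   # meets or exceeds the state target
    if value >= minimum:
        return 0.5   # meets the minimum standard but falls short of the target
    return 0.0       # falls below the minimum standard


def domain_score(indicators):
    """Weighted average of indicator scores within a single domain.
    `indicators` is a list of (score, indicator_weight) pairs."""
    total_weight = sum(weight for _, weight in indicators)
    return sum(score * weight for score, weight in indicators) / total_weight


def total_score(domain_scores):
    """The four domains are equally weighted in the overall score."""
    return sum(domain_scores) / len(domain_scores)


# Hypothetical program: each call is (program value, minimum standard, state target).
selection = domain_score([
    (score_indicator(3.2, 3.0, 3.4), 0.6),     # e.g., average entering GPA
    (score_indicator(0.25, 0.20, 0.35), 0.4),  # e.g., share of underrepresented candidates
])
knowledge = domain_score([(score_indicator(0.96, 0.85, 0.95), 1.0)])    # e.g., exam pass rate
performance = domain_score([(score_indicator(0.70, 0.65, 0.80), 1.0)])  # e.g., on-the-job measure
state_needs = domain_score([(score_indicator(0.55, 0.50, 0.70), 1.0)])  # e.g., in-state employment

print(total_score([selection, knowledge, performance, state_needs]))  # 0.625
```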

Based on these scores, each TPP receives a publicly reported performance designation for each domain, as well as an overall designation (see Figures 3 and 4).

Figure 3. IEPP Performance Designation Categories

Source: Adapted from “Illinois Preparation Profile Continuous Improvement and Accountability System,” Partnership for Educator Preparation, Illinois State Board of Education, October 2020, p. 13.

Figure 4. Public-Facing Score Report Breakdown for an Individual Preparation Program

Source: Illinois State Board of Education, Screenshot of Individual TPP Performance Report from the IEPP Public-facing Data Platform, https://apps.isbe.net/epp/public#/institutions.

ISBE publicly released its preparation program profiles for the first time in December 2020, but the overall program performance designations were only for informational purposes, in order to build stakeholder understanding of the system and to address any unforeseen issues. For example, the 2020 release brought ISBE’s attention to the fact that the designations undervalued the role that some preparation programs play in preparing teachers for employment in non-public schools or early learning settings, an issue that was rectified in the 2021 IEPP release. The February 2023 release of the 2022 reports was the first time that a program’s IEPP designation could influence its reauthorization status, as outlined in Figure 5 below.[5]

On the “About” page for its public IEPP profile, each TPP can include narrative information it wants to share with prospective teachers, LEAs, and school hiring staff on “what makes us special,” details on the clinical experience(s) offered, and an explanation of its approach to preparing candidates for success in the classroom. Additionally, beginning with the 2022 IEPP release, TPPs can provide additional context for their reauthorization status. This section was added at the request of TPPs that want to explain factors that affected their status but are not evident from their data profiles, such as serving a small number of candidates or being a brand-new program.

In addition to offering insights into individual TPPs, the IEPP also provides a view of the statewide teacher preparation landscape to inform policymakers and advocates.

Taking the IEPP from Vision to Reality

How did ISBE move from the initial 2016 vision to the complex, yet user-friendly, version of the IEPP available today?

When ISBE decided to pursue a better system for providing data on its TPPs, it did so with the long-term goals of promoting continuous improvement to strengthen teacher preparation statewide and ensuring all new teachers are learner-ready the first day they lead a classroom. ISBE recognized that it couldn’t do this work alone. With financial support from the Joyce Foundation and thought partnership from Advance Illinois, Education First, Teacher Preparation Analytics, and a group of state education agencies convened by the Schusterman Family Foundation to discuss data systems for supporting TPP improvement, ISBE created a vision and initial goals for such a system.[6] A result of this vision was ISBE’s creation of the Partnership for Educator Preparation (PEP)—a steering committee of roughly 20 state teacher preparation stakeholders selected to ensure representation from a diverse set of stakeholder groups as well as familiarity with teacher data. The committee included teacher candidates; practicing teachers, including representatives of the state teacher unions; school principals; hiring managers at LEAs, including those that were serving high-need student populations; IHE and TPP staff who would use the data for decision-making; and other experts on the TPP data already available in the state.

The initial PEP meetings were facilitated by two national experts on teacher preparation and data, Michael Allen and Charles Coble of Teacher Preparation Analytics. Allen and Coble stressed that the ultimate goal of developing a new TPP data system was to promote and support program improvement, with greater transparency about TPPs for prospective teachers and hiring managers as a secondary goal. The facilitators encouraged members to reflect on a teacher candidate’s journey from program entry to becoming a teacher of record, and to focus on the elements with an evidence-based connection to candidates’ outcomes and their future students’ outcomes. Allen and Coble also made it clear that the performance designation system was meant to measure programs against common standards rather than against one another; the aim was to promote improvement, not to rank programs.

Brad White, a PEP member from Lewis and Clark Community College, said that the committee tried to start with a blank slate in developing the metrics and weights for the new system, but the ideation process was difficult, and the committee rather quickly found it necessary to move to more of a straw man approach. When the facilitators made this shift, White recalled, they made it clear that “‘we're going to put measures out there, folks, but you can critique them and let us know if you have something better.’ And I think that worked really well. Throughout the process, I feel like there was a lot more agreement about everything than I expected there to be going in. It seemed like everyone was really trying to think about how we can use data to drive and change what we're doing within our programs and improve the way that we're preparing our prospective teachers.”

Throughout the work, the three principles of “fair, clear, and supported” guided the committee’s decisions:

  1. Fair: measures program performance fairly and provides metrics and program context so that the system is not biased against programs based on demographics.
  2. Clear: indicates program performance in a way that is understandable to program staff, PK–12 educators, prospective candidates, and the public.
  3. Supported: provides equitable supports to programs based on their context.

By the late summer of 2016, the PEP steering committee had developed its final recommendations for the new data reporting system, using available research and the three principles above to guide the selection of metrics and weights. It also made decisions regarding how the data should be displayed and to whom. For example, because most IHEs house multiple TPPs focused on various instructional areas (e.g., elementary education, middle-grade science, early childhood special education, etc.) with different faculty, curricula, and requirements, the committee determined that the IEPP should collect and report data for each TPP rather than for the overall IHE entity, in order for the information to be useful for program improvement efforts.

Another way the committee tried to focus the system on program improvement was by creating two versions of the IEPP: one for the public and one for the IHEs housing TPPs. The IHE-facing version was designed to allow administrators to access additional and more in-depth reports and analysis of data across their portfolio of TPPs so they could more easily explore strengths and weaknesses and identify strategies for improvement (see Figure 6 for an example).

Figure 6. IEPP Institutional View, Candidate Pass Rates on Relevant Licensure Subject Exam

Source: Illinois State Board of Education, Screenshot of Institution-Wide TPP Report from the IEPP Institution-Facing Data Platform.

In the fall of 2016, 36 TPPs, representing three-quarters of the state’s teaching candidates, applied to be part of the initial IEPP “mini pilot.” After making adjustments under the guidance of the PEP steering committee, ISBE launched a two-year statewide pilot in the fall of 2017 with all 58 IHEs.

Modeling a Commitment to Learning and Improving

ISBE has modeled the growth mindset and commitment to ongoing improvement that it promotes for its TPPs through a continued focus on updating the IEPP to better realize its stated goals.

Throughout the piloting process, ISBE and the PEP steering committee devised solutions to unanticipated issues. For example, when many programs had insufficient numbers of program candidates to reliably report data on some metrics, the committee decided that the default would be to pull data from multiple TPP cohorts in order to keep more metrics in play.[7] At the request of its IHEs, ISBE also chose to align the IEPP data reporting periods and timeline with that of the federal HEA Title II data collection process so IHEs could use the same data for both.
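
Below is a minimal sketch of that cohort-pooling logic, under stated assumptions: the minimum cell size and the cohort data are hypothetical, and ISBE’s actual rules (including the five-year default described in note [7]) are set out in its scoring documentation.

```python
# Hypothetical sketch of pooling multiple cohort years when a single year has
# too few candidates to report a metric reliably. MIN_N is an assumed value.

MIN_N = 10  # assumed minimum candidate count for reliable reporting

def pooled_rate(cohorts, max_years=5):
    """Pool the most recent cohort years until the denominator is large enough
    to report, or give up after `max_years`.

    `cohorts` is a list of (numerator, denominator) pairs ordered from most
    recent to oldest, e.g., (completers employed in-state, total completers).
    Returns the pooled rate, or None if the metric still cannot be reported.
    """
    numerator = denominator = 0
    for num, den in cohorts[:max_years]:
        numerator += num
        denominator += den
        if denominator >= MIN_N:
            return numerator / denominator
    return None  # still too few candidates; suppress the metric

# A small program with 4, 3, and 6 completers in its three most recent cohorts:
print(pooled_rate([(3, 4), (2, 3), (5, 6)]))  # pools three years -> 10/13, about 0.77
```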

At the end of the two-year pilot, in the fall of 2019, ISBE officially replaced the previous annual data reporting structure with the new IEPP. At this time, ISBE only shared the results with TPPs, not with the public, to provide time for both ISBE and the TPPs to make any necessary course corrections. ISBE also engaged North Third, an education data operationalization consultant, to conduct surveys, interviews, and platform testing with IHEs to assess their ability to collect, validate, and use the data in the IEPP.

Despite complications from the COVID pandemic, the public-facing version went live in late 2020. Again, ISBE conducted surveys and other user research, this time with K–12 hiring managers and teacher candidates, in addition to IHEs. As a result of stakeholder feedback, ISBE decided to add the “About” section to each program profile so TPPs could offer descriptive information for prospective candidates and employers to help contextualize their programs and highlight important distinctions between them.

In early 2022, ISBE again worked with North Third to conduct additional surveys and interviews with the IEPP’s three primary audiences—TPPs, school and LEA hiring staff, and prospective educators—to understand if and how they were using the data platform relative to PEP’s initial expectations (see Figure 7) and assess what was working well and what could be improved.[8]

Figure 7. IEPP Homepage Describes Potential Uses for Each Target Audience

Source: Illinois Educator Preparation Profile, Illinois State Board of Education, https://apps.isbe.net/epp/public#/.

From this work, ISBE learned that most users’ experiences had been positive, but that the target audiences outside of IHEs and TPPs were largely unaware of the tool.

ISBE also learned that, while there was a lot of information in the IEPP that users thought was valuable, there were additional data points they would like access to that were not available. In particular, there was broad interest in including candidate completion rates (not just the number of completers), a metric ISBE was not able to reliably calculate because of inconsistencies in how TPPs determine which of their candidates are officially enrolled. In response, ISBE refined its regulations in summer 2022 to define “candidate enrollment” in a way that allows it to create a standardized cohort of candidates to determine completion rates. IHEs also expressed concern that job placement rates for program completers count only public school employment, an issue ISBE is addressing by finding ways to collect employment data for completers teaching in non-public schools and in private but publicly funded early childhood settings.

This new research also confirmed what the 2020 stakeholder research had illuminated: each of the three target audiences has different wants and needs, and their perceptions of how well the IEPP can meet these in its current form vary. Because the PEP committee’s primary focus in developing the IEPP was on promoting improvement of the state’s TPPs, ISBE was not surprised to see that the platform does not meet the needs of prospective teachers and K–12 hiring managers as well as it does those of IHEs. Some of what hiring managers want to know is likely outside the purview of a retrospective data platform, as they are often seeking help with immediate recruiting needs: “who is available to teach right now?” But other elements, such as the location and types of schools in which candidates complete their student teaching, could be included. Prospective teachers want more practical information about student teaching, including where they might get placed and information about their job prospects (e.g., “how quickly do completers get jobs and what kind of schools do they work in?”).

ISBE continues to respond to stakeholder feedback to enhance the IEPP. One way it plans to do so is by expanding the reporting system to collect additional information in areas such as program cost, LEA-IHE partnerships, and innovative pipeline efforts. The agency is also working to expand awareness about the IEPP tool beyond IHE stakeholders. One way it has done so is by including links to the IEPP in areas of the ISBE website that are frequently accessed by prospective educators, such as its Pathways to Licensure page.

But ISBE’s biggest focus is on increasing the number of metrics that it can automatically populate in the IEPP (that is, without TPPs needing to manually report them). To do this, ISBE is pursuing the potential for creating connections between other state data systems, likely through data-sharing agreements. The agency’s motivation is twofold: doing so would reduce the time required of TPPs to complete their data profile, and, even more importantly, it would enhance data quality by reducing manual entry errors. “Despite it being a heavy lift to put data-sharing agreements in place, the pros outweigh the cons,” said Emily Fox, director of the Educator Effectiveness division at ISBE, which manages the IEPP.

Lessons for Other States

Illinois’s IEPP efforts offer five key lessons for state education leaders interested in developing systems that provide key stakeholders with more useful and accessible teacher preparation program data with a goal of driving improvement.[9]

1) Investigate other states’ successes and challenges and apply what makes sense in your context.

For example, by engaging on this topic with other states, Illinois uncovered ways to minimize preparation programs’ data reporting burden, including by developing data-sharing agreements across the various state agencies that collect relevant metrics.

2) Start with clarity about the primary state objective and primary intended user, and keep those in mind at each decision point. While additional goals and users may be fleshed out in the future, try to avoid mission creep in the early stages of the work.

For example, since the primary audience for the IEPP is IHEs, Illinois developed a more expansive platform view for IHEs than the public view, based on in-depth research it conducted on the needs of this stakeholder group. With the platform fully functional, ISBE is now considering how it can better meet the expressed data desires of teacher candidates and K–12 hiring managers.

3) Select members of the stakeholder advisory group thoughtfully. Reaching beyond the usual suspects can help ensure breadth and diversity of expertise in answering key questions.

While ISBE’s deliberate creation of a broad group of stakeholders was instrumental in the state’s success, it could have benefitted from being even broader. For example, because the PEP committee did not include anyone with early childhood expertise, it was at a loss for how to capture employment data for candidates who were working in early childhood centers. While it found a solution—gathering employment data from the Illinois Department of Employment Security—it could have gotten there more quickly if the committee had included a wider array of expertise.

4) Ensure that a broad set of evidence-based measures is in the mix to help stakeholders with conflicting viewpoints reach consensus.

While various stakeholders in the PEP committee held disparate opinions about the value of including certain data elements in the IEPP performance designations, having numerous, diverse measures seemed to allow all stakeholder viewpoints to be accepted and prevented any one recommended metric from holding up consensus.

5) Start slowly and budget for the fact that even the final product must shift over time based on stakeholder feedback.

Illinois piloted the IEPP system with a substantial subgroup of users, directly engaged with them to understand how the system did and did not meet their goals, and course-corrected as necessary. Even with the platform formally in place, the state has continued to solicit input from key stakeholders at every stage of the process and to iterate based on their feedback.

Conclusions

Illinois’s successful launch of the IEPP platform makes it clear that, with sufficient planning, funding, and stakeholder engagement, states have the ability to collect and report more valuable data on their educator preparation programs. The IEPP goes beyond looking at the requirements of a teacher preparation program to understand candidates’ program completion, licensure, and employment outcomes, including their success in supporting student learning and engagement, and how those outcomes intersect with candidate demographic attributes.

Of course, the IEPP system is not without its shortcomings. While the IEPP offers public users insights on the overall racial/ethnic diversity of a teacher preparation program’s candidates, it does not provide any specific demographic details about candidates and completers, the way that some other states’ TPP data systems do. For example, New Jersey shares the proportion of a program’s completers from a variety of demographic subgroups, and offers further context by displaying them alongside the state’s overall proportion of P–12 students and P–12 teachers from those same demographic groups.

But the open-eyes, open-ears approach that Illinois has taken in developing and launching its teacher preparation data platform, together with its commitment to ongoing improvement, is likely to pay off. For example, there is already a plan in place to review the indicator minimums and state targets every five years, or sooner if deemed necessary, in order to ensure they are still appropriate.

While still early in its existence, the IEPP is helping bring some teacher preparation issues into clearer view in Illinois. This includes highlighting which programs are helping to supply teachers in the grade and subject areas in highest demand by K–12 schools, and which are serving to promote greater diversity in the teacher workforce. In making this information visible, the platform encourages teacher preparation programs to improve their offerings and provides insights that can inform prospective teachers’ and K–12 hiring managers’ decisions.

As the nation struggles to attract, develop, and retain a sufficiently diverse and qualified teacher workforce, Illinois’s development of the IEPP platform also offers state and federal policymakers an opportunity to rethink how data collection and reporting can incentivize meaningful, positive changes in the teacher pipeline. Investment in robust and linkable longitudinal data systems that can follow an individual’s trajectory from PreK–12 education settings into postsecondary and workforce settings can help states answer critical questions about their teacher workforce, from pre-service preparation and beyond, while minimizing the burden on teacher preparation programs. Federal policymakers can maximize the impact of these efforts by providing financial and legislative support, including by expanding HEA Title II data reporting requirements to incorporate meaningful outcomes for teacher preparation program completers and ensuring consistent and explicit definitions.


Acknowledgments

Thank you to the members of the PEP steering committee and the Illinois State Board of Education who graciously shared their knowledge and experiences with this work, including Michael Allen and Charles Coble of Teacher Preparation Analytics; Emily Fox of the Illinois State Board of Education; Jim O’Connor of Advance Illinois; Meg Towle of North Third; and Brad White of Lewis and Clark Community College, who engaged in a video interview. This brief also benefited from the editorial guidance of New America colleagues Sabrina Detlef and Elena Silva, and from the design and communication support of Mandy Dean and Fabio Murgia. This work was made possible thanks to the generous support of the Bill & Melinda Gates Foundation and the Joyce Foundation. The views expressed in this report are those of the author alone and do not necessarily reflect the views of these foundations.

Notes

[1] For detailed statistics, see “2022–23 School Year Staffing Dashboard,” 2022 School Pulse Panel, Institute of Education Sciences, September 27, 2022, https://ies.ed.gov/schoolsurvey/spp/.

[2] For a review of the prospective teacher data necessary to answer questions about the teacher pipeline relative to the data available, see Tuan D. Nguyen, Chanh B. Lam, and Paul Bruno, Is There a National Teacher Shortage? A Systematic Examination of Reports of Teacher Shortages in the United States, EdWorkingPaper no. 22-631 (Providence, RI: Annenberg Institute at Brown University, August 2022), 21–27, https://doi.org/10.26300/76eq-hj32.

[3] Jeffrey J. Kuenzi, Teacher Preparation Policies and Issues in the Higher Education Act (Washington, DC: Congressional Research Service, November 16, 2018), https://files.eric.ed.gov/fulltext/ED593607.pdf.

[4] “Illinois Preparation Profile Continuous Improvement and Accountability System,” Partnership for Educator Preparation, Illinois State Board of Education, October 2020, p. 13, https://www.isbe.net/Documents/IPP-Continuous-Improvement-Accountability-System.pdf.

[5] “Illinois Educator Preparation Profile Continuous Improvement and Accountability System—Reauthorization Guide 2022,” Illinois State Board of Education, https://www.isbe.net/Documents/EPP-Reauthorization-Guide.pdf.

[6] For an overview of lessons gathered from the Schusterman Family Foundation’s state education agency convenings on developing more robust teacher preparation data systems, see Getting to Better Prep: A State Guide for Better Teacher Preparation Data Systems (New York, NY: TNTP, April 2017), https://tntp.org/assets/Getting_to_Better_Prep_09212017.pdf.

[7] The IEPP currently relies on five years’ worth of data for most metrics, although it relied on only three years’ worth during its 2019 launch.

[8] North Third surveyed 226 respondents, primarily teacher candidates, IHE/TPP administrators, K–12 hiring managers, and K–12 counselors. Twelve in-depth interviews and user testing analyses were also conducted, with candidates (3), IHEs (5), and K–12 hiring managers (4).

[9] For states interested in developing a more robust system for collecting and reporting data on TPPs and the novice teacher pipeline, see TNTP’s technical assistance guide, Getting to Better Prep.
