Last week, PARCC CEO Laura Slover announced that PARCC, the multi-state Common Core-aligned assessment system, will offer its member states new flexibility next year. Until now, states were required to purchase PARCC assessments as-is, without altering or adding to the content, and were required to use Pearson as the vendor for administration. But starting in 2016-2017, states will be able to add content to PARCC assessments, pick and choose items from a bank of PARCC test questions to create their own assessments, or purchase blocks of test questions from which to craft an exam. States will also be able to choose their own vendor to administer the assessment.
This new flexibility is likely a move to attract new customers and retain current member states. The PARCC consortium began with 26 member states in 2010, but states have dropped out at a steady rate ever since: eleven (plus D.C.) administered PARCC tests last year, and only six states and D.C. will give the test at the end of the current school year. In some states, PARCC has been the sacrificial lamb offered up to appease anti-testing, anti-Common Core, and anti-federal-oversight sentiment (see EdWeek's analysis here). Other states have balked at not having complete control over their examinations and are leaving to design their own Common Core-aligned assessments. A timely example is Massachusetts, which announced last month that it will opt out of PARCC to create its own exam (see our recent post on this here). With its membership declining, PARCC, which cost taxpayers roughly $185 million in Race to the Top grant funds, has to do something to stem the tide of defections and lure new members. But the flexibility PARCC is now offering may run counter to the two advantages it had in the first place: setting high standards for all students and making it possible to compare how well students are doing across multiple states.
A shared assessment system puts pressure on states to keep up with one another, and ultimately raises expectations for all.
While these new flexibilities may be good news for the future of PARCC, they aren't necessarily good news for the future of student achievement. Historically, states' pursuit of high standards and high achievement for all students has varied widely. Some states, like Massachusetts, have set the bar high, while others, like Louisiana, have taught and assessed students at a much lower standard. The Common Core State Standards were developed to level the playing field, so that all students are given the opportunity to learn rigorous content based on high standards. Creating shared assessments aligned to these standards was an integral part of the Common Core design: if we want all states to rise to this standard of excellence, we need a common measure by which to compare their progress. A shared assessment system puts pressure on states to keep up with one another, and ultimately raises expectations for all.
Unfortunately, if states are allowed to set their own bar and pick and choose which elements of the exam to administer, the pressure is officially off. PARCC's new options give states the freedom to lower standards for their students by creating an easier exam. Given the heat on many states for low scores in their first few years of PARCC exams (a dip that education leaders warned was inevitable in the first few years of new standards and new assessments), it would seem awfully tempting to lower the bar and enjoy the resulting boost in student scores. PARCC's position as a national group divorced from any one state's interests relieves it of this temptation. This is one of PARCC's greatest advantages, but one that it loses as it relinquishes control over the assessment to states.
Now that each member state will be able to tweak exams, we will no longer be able to easily and accurately compare their progress.
The second of PARCC’s advantages, the ability to compare results across states, will also be lost with these new flexibilities. When member states all took the same assessment, local and national stakeholders could compare just how well students in each state were being served by their schools. This was an unprecedented opportunity: other than NAEP, which is administered only in alternating years and only to 4th and 8th grade students, we have never been able to compare achievement so broadly across the states. Now that each member state will be able to tweak its exams, we will no longer be able to easily and accurately compare their progress. This rolls back a significant step toward equalizing outcomes for students, allowing states to continue obscuring how well they serve students relative to their peers across the nation.
No doubt PARCC anticipated this particular criticism when it rolled out the new options. The consortium's press release notes that states will have the option to purchase blocks of assessment questions, with the supposed advantage that this will allow us to continue comparing achievement across states. That option is certainly preferable to the item-bank approach, which would allow states to create unique tests that would be very difficult to compare. But given the option to create their own test, how many of the remaining PARCC states will really choose the test-block option? PARCC announced Monday morning that Louisiana has already opted for what, from its description, appears to be the test-bank approach: PARCC questions will comprise only half of the state exam. Massachusetts announced today that it too will take a 'hybrid' approach. And even if a larger group of states opted for these approaches, the array of options creates a confusing picture for local and national stakeholders. For experts, analysis becomes more complicated; for students and families, understanding how their state stacks up becomes harder.
PARCC and the standards it assesses were exciting and hard-won progress for students, but this move from the PARCC consortium represents a step away from that vision: a shared system of expectations that would hold each state accountable for how well it serves its students. Given the millions of tax dollars spent and the political will mustered to make the shift to high-quality common assessments, it's frustrating to see reform rolled back before it has a chance to work.